TY - JOUR
T1 - Adaptive Contrastive Learning with Label Consistency for Source Data Free Unsupervised Domain Adaptation
AU - Zhao, Xuejun
AU - Stanislawski, Rafal
AU - Gardoni, Paolo
AU - Sulowicz, Maciej
AU - Glowacz, Adam
AU - Krolczyk, Grzegorz
AU - Li, Zhixiong
N1 - Funding Information:
Funding: The research leading to these results has received funding from the Norway Grants 2014–2021 operated by the National Science Centre under Project Contract No 2020/37/K/ST8/02748.
Publisher Copyright:
© 2022 by the authors. Licensee MDPI, Basel, Switzerland.
PY - 2022/6/1
Y1 - 2022/6/1
N2 - Unsupervised domain adaptation, which aims to alleviate the domain shift between the source domain and the target domain, has attracted extensive research interest; however, access to source domain data is often infeasible in practical application scenarios due to privacy issues and intellectual property rights. In this paper, we discuss a more challenging and practical problem, source-free unsupervised domain adaptation, which requires adapting the source domain model to the target domain without the aid of source domain data. We propose label consistent contrastive learning (LCCL), an adaptive contrastive learning framework for source-free unsupervised domain adaptation that encourages target domain samples to learn class-level discriminative features. Since the source domain data are unavailable, we introduce a memory bank to store samples sharing the same pseudo-label output and samples obtained by clustering, so that trusted historical samples are incorporated into contrastive learning. In addition, we demonstrate that LCCL is a general framework that can also be applied to standard unsupervised domain adaptation. Extensive experiments on digit recognition and image classification benchmark datasets demonstrate the effectiveness of the proposed method.
AB - Unsupervised domain adaptation, which aims to alleviate the domain shift between the source domain and the target domain, has attracted extensive research interest; however, access to source domain data is often infeasible in practical application scenarios due to privacy issues and intellectual property rights. In this paper, we discuss a more challenging and practical problem, source-free unsupervised domain adaptation, which requires adapting the source domain model to the target domain without the aid of source domain data. We propose label consistent contrastive learning (LCCL), an adaptive contrastive learning framework for source-free unsupervised domain adaptation that encourages target domain samples to learn class-level discriminative features. Since the source domain data are unavailable, we introduce a memory bank to store samples sharing the same pseudo-label output and samples obtained by clustering, so that trusted historical samples are incorporated into contrastive learning. In addition, we demonstrate that LCCL is a general framework that can also be applied to standard unsupervised domain adaptation. Extensive experiments on digit recognition and image classification benchmark datasets demonstrate the effectiveness of the proposed method.
KW - unsupervised domain adaptation
KW - source free domain adaptation
KW - contrastive learning
UR - http://www.scopus.com/inward/record.url?scp=85131166078&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85131166078&partnerID=8YFLogxK
U2 - 10.3390/s22114238
DO - 10.3390/s22114238
M3 - Article
C2 - 35684857
SN - 1424-8220
VL - 22
JO - Sensors
JF - Sensors
IS - 11
M1 - 4238
ER -