Hierarchical Unsupervised Relation Distillation for Source Free Domain Adaptation

Bowei Xing*, Xianghua Ying, Ruibin Wang, Ruohao Guo, Ji Shi, Wenzhen Yue

Abstract


Source-free domain adaptation (SFDA) aims to transfer a model trained on a labeled source domain to an unlabeled target domain without accessing the source data. Recent SFDA methods predominantly rely on self-training, which supervises the model with pseudo labels generated from individual data samples. However, they often ignore the data structure and inter-sample relationships that benefit adaptive training. In this paper, we propose a novel hierarchical relation distillation framework that establishes multi-level relations across samples in an unsupervised manner, fully exploiting the inherent data structure to guide sample training instead of relying on isolated pseudo labels. We first distinguish source-like samples by prediction reliability during training, then distill knowledge to the remaining target-specific samples by transferring both a local clustering relation and a global semantic relation. Specifically, we leverage affinity with nearest-neighbor samples for the local relation and similarity to category-wise Gaussian mixtures for the global relation, offering complementary supervision that facilitates student learning. To validate our approach's effectiveness, we conduct extensive experiments on diverse benchmarks, achieving better performance than previous methods.
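The two relation levels described in the abstract can be sketched in a minimal form. The snippet below is an illustrative assumption, not the paper's actual implementation: the local relation is approximated as a softmax-weighted affinity to the k nearest neighbors in a feature memory bank, and the global relation as a posterior over class-wise isotropic Gaussians (a simplified stand-in for the paper's category-wise Gaussian mixtures). All function names and parameters here are hypothetical.

```python
import numpy as np

def local_relation(feat, bank, k=3):
    """Hypothetical local clustering relation: softmax affinity
    to the k nearest neighbors (by cosine similarity) in a
    feature memory bank of shape (N, D)."""
    f = feat / np.linalg.norm(feat)
    b = bank / np.linalg.norm(bank, axis=1, keepdims=True)
    sims = b @ f                       # cosine similarity to each bank entry
    idx = np.argsort(sims)[-k:]        # indices of the k nearest neighbors
    w = np.exp(sims[idx])              # softmax over neighbor similarities
    return idx, w / w.sum()

def global_relation(feat, class_means, var=1.0):
    """Hypothetical global semantic relation: posterior over
    category-wise isotropic Gaussians with shared variance,
    given class means of shape (C, D)."""
    d2 = ((class_means - feat) ** 2).sum(axis=1)   # squared distance per class
    logp = -d2 / (2.0 * var)                       # Gaussian log-likelihood (up to const.)
    p = np.exp(logp - logp.max())                  # stable softmax
    return p / p.sum()
```

A teacher's local and global relations could then serve as soft targets for a student on target-specific samples, complementing hard pseudo labels.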

Related Material


[pdf] [DOI]