Noise Optimized Conditional Diffusion for Domain Adaptation
Lingkun Luo, Shiqiang Hu, Liming Chen
Abstract
Pseudo-labeling is a cornerstone of Unsupervised Domain Adaptation (UDA), yet the scarcity of High-Confidence Pseudo-Labeled Target Domain Samples (hcpl-tds) often leads to inaccurate cross-domain statistical alignment, causing DA failures. To address this challenge, we propose Noise Optimized Conditional Diffusion for Domain Adaptation (NOCDDA), which seamlessly integrates the generative capabilities of conditional diffusion models with the decision-making requirements of DA to achieve task-coupled optimization for efficient adaptation. For robust cross-domain consistency, we modify the DA classifier to align with the conditional diffusion classifier within a unified optimization framework, enabling forward training on noise-varying cross-domain samples. Furthermore, we argue that the conventional \( \mathcal{N}(0, I) \) initialization in diffusion models often generates class-confused hcpl-tds, compromising discriminative DA. To resolve this, we introduce a class-aware noise optimization strategy that refines the sampling regions for class-specific hcpl-tds generation in the reverse process, effectively enhancing cross-domain alignment. Extensive experiments across 5 benchmark datasets and 29 DA tasks demonstrate significant performance gains of NOCDDA over 31 state-of-the-art methods, validating its robustness and effectiveness.
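The core idea of replacing the standard \( \mathcal{N}(0, I) \) initialization with class-aware sampling regions can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the paper's implementation: it supposes each class has a prototype vector (e.g. estimated from high-confidence pseudo-labeled target features) and draws the initial reverse-diffusion noise \( x_T \) from a Gaussian centered on that prototype, so that generation starts in a class-specific region of noise space. The `prototypes` mapping and `class_aware_init` helper are hypothetical names introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def class_aware_init(prototypes, cls, sigma=0.3, n=4):
    """Sample initial reverse-diffusion noise around a class prototype.

    Standard diffusion sampling starts from x_T ~ N(0, I); here we instead
    draw x_T ~ N(mu_c, sigma^2 I), biasing the reverse process toward
    class `cls`. `prototypes` is a hypothetical dict mapping each class
    label to a mean vector in the model's input (or latent) space.
    """
    mu = prototypes[cls]
    return mu + sigma * rng.standard_normal((n, mu.shape[0]))

# Toy prototypes for a 2-class, 3-dimensional space (illustrative only).
prototypes = {
    0: np.array([2.0, 0.0, 0.0]),
    1: np.array([-2.0, 0.0, 0.0]),
}

# Initial noise for class 0 clusters around its prototype rather than the
# origin, separating the starting regions of the two classes.
x_T = class_aware_init(prototypes, cls=0)
print(x_T.shape)
```

In a full pipeline these samples would then be passed through the trained conditional denoiser; the point of the sketch is only that the starting distribution, not the denoiser, is what the noise optimization changes.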