Gradual Domain Adaptation via Manifold-Constrained Distributionally Robust Optimization

2024-10-17

Amir Hossein Saberi, Amir Najafi, Ala Emrani, Amin Behjati, Yasaman Zolfimoselo, Mahdi Shadrooy, Abolfazl Motahari, Babak H. Khalaj

Abstract

The aim of this paper is to address the challenge of gradual domain adaptation within a class of manifold-constrained data distributions. In particular, we consider a sequence of T ≥ 2 data distributions P_1, …, P_T undergoing a gradual shift, where each pair of consecutive measures P_i, P_{i+1} are close to each other in Wasserstein distance. We have a supervised dataset of size n sampled from P_0, while for the subsequent distributions in the sequence, only unlabeled i.i.d. samples are available. Moreover, we assume that all distributions exhibit a known favorable attribute, such as (but not limited to) having intra-class soft/hard margins. In this context, we propose a methodology rooted in Distributionally Robust Optimization (DRO) with an adaptive Wasserstein radius. We theoretically show that this method guarantees that the classification error across all P_i can be suitably bounded. Our bounds rely on a newly introduced compatibility measure, which fully characterizes the error propagation dynamics along the sequence. Specifically, for inadequately constrained distributions, the error can escalate exponentially as we progress through the gradual shifts. Conversely, for appropriately constrained distributions, the error can be shown to grow only linearly, or even be eliminated entirely. We substantiate our theoretical findings through several experimental results.
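To make the setup concrete, here is a minimal toy sketch of the gradual-adaptation pipeline the abstract describes: a classifier is fit on the labeled source sample, then repeatedly pseudo-labels each unlabeled domain in the sequence and refits. This is not the authors' algorithm; the DRO step is approximated by a Wasserstein-robust hinge loss for a linear model (using the standard duality that, under an L2 transport cost, the worst case over an ε-radius ball shifts the margin by ε·‖w‖), and the radius `eps` is held fixed rather than adaptive. All names (`robust_hinge_fit`, `make_domain`) and constants are illustrative.

```python
import numpy as np

def robust_hinge_fit(X, y, eps=0.2, lr=0.05, epochs=300, seed=0):
    """Subgradient descent on a Wasserstein-robust hinge loss.

    For a linear classifier and an L2 transport cost, the worst case of the
    hinge loss over an eps-radius Wasserstein ball reduces to a margin shift
    of eps * ||w||_2 (a standard duality result; an illustrative stand-in
    for the paper's adaptive-radius DRO step, not its exact procedure).
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=X.shape[1])
    for _ in range(epochs):
        norm_w = np.linalg.norm(w) + 1e-12
        # Samples whose robust hinge loss max(0, 1 - y*w.x + eps*||w||) is active
        active = (1.0 - y * (X @ w) + eps * norm_w) > 0
        subgrads = np.where(active[:, None],
                            -y[:, None] * X + eps * w / norm_w, 0.0)
        w -= lr * subgrads.mean(axis=0)
    return w

def make_domain(rng, angle, n=200):
    """Two Gaussian classes whose means rotate with `angle` (a toy P_i)."""
    mu = 3.0 * np.array([np.cos(angle), np.sin(angle)])
    y = rng.choice([-1.0, 1.0], size=n)
    return y[:, None] * mu + rng.normal(scale=0.5, size=(n, 2)), y

rng = np.random.default_rng(1)
X0, y0 = make_domain(rng, 0.0)                  # labeled source sample (P_0)
shifts = [make_domain(rng, a)[0]                # unlabeled P_1, ..., P_T
          for a in (np.pi / 8, np.pi / 4, 3 * np.pi / 8, np.pi / 2)]
X_T, y_T = make_domain(rng, np.pi / 2)          # held-out final domain

w = robust_hinge_fit(X0, y0)
naive_acc = (np.sign(X_T @ w) == y_T).mean()    # direct source->target jump

for X in shifts:                                # gradual self-training loop
    y_hat = np.sign(X @ w)                      # pseudo-label current domain
    y_hat[y_hat == 0] = 1.0                     # break rare sign(0) ties
    w = robust_hinge_fit(X, y_hat)              # robust refit on pseudo-labels
gradual_acc = (np.sign(X_T @ w) == y_T).mean()
```

On this rotating-Gaussians example, jumping straight from the source to the final domain performs near chance, while tracking the shift step by step keeps the pseudo-label error small at every stage, which is the error-propagation phenomenon the paper's compatibility measure quantifies.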
