
Transfer Learning via Minimizing the Performance Gap Between Domains

2019-12-01 · NeurIPS 2019 · Code Available

Boyu Wang, Jorge Mendez, Mingbo Cai, Eric Eaton


Abstract

We propose a new principle for transfer learning, based on a straightforward intuition: if two domains are similar to each other, the model trained on one domain should also perform well on the other domain, and vice versa. To formalize this intuition, we define the performance gap as a measure of the discrepancy between the source and target domains. We derive generalization bounds for the instance weighting approach to transfer learning, showing that the performance gap can be viewed as an algorithm-dependent regularizer, which controls the model complexity. Our theoretical analysis provides new insight into transfer learning and motivates a set of general, principled rules for designing new instance weighting schemes for transfer learning. These rules lead to gapBoost, a novel and principled boosting approach for transfer learning. Our experimental evaluation on benchmark data sets shows that gapBoost significantly outperforms previous boosting-based transfer learning algorithms.
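The core intuition above — similar domains should yield models that transfer well in both directions — can be made concrete with a small empirical check. The sketch below is illustrative only, not the paper's formal definition of the performance gap: it trains a simple decision stump on each domain and measures how much each model's error grows when evaluated on the other domain. All function names and data here are hypothetical.

```python
# Illustrative sketch of the performance-gap intuition (not the paper's exact
# definition): if domains are similar, a model trained on one should also do
# well on the other, so the cross-domain error increase should be small.

def stump_fit(xs, ys):
    """Fit the best 1-D threshold stump: predict 1 iff x >= t (optionally flipped)."""
    best = None
    for t in sorted(set(xs)):
        for flip in (False, True):
            preds = [(x >= t) != flip for x in xs]
            err = sum(p != bool(y) for p, y in zip(preds, ys)) / len(ys)
            if best is None or err < best[0]:
                best = (err, t, flip)
    return best[1], best[2]

def stump_err(model, xs, ys):
    """Empirical 0-1 error of a stump on a labeled sample."""
    t, flip = model
    preds = [(x >= t) != flip for x in xs]
    return sum(p != bool(y) for p, y in zip(preds, ys)) / len(ys)

def performance_gap(src, tgt):
    """How much worse each domain's model gets on the other domain (toy measure)."""
    h_s = stump_fit(*src)
    h_t = stump_fit(*tgt)
    gap_s = stump_err(h_s, *tgt) - stump_err(h_s, *src)
    gap_t = stump_err(h_t, *src) - stump_err(h_t, *tgt)
    return max(gap_s, gap_t, 0.0)

# Similar domains -> small gap; conflicting domains -> large gap.
src       = ([0.1, 0.2, 0.8, 0.9], [0, 0, 1, 1])
tgt_close = ([0.15, 0.3, 0.8, 0.95], [0, 0, 1, 1])
tgt_far   = ([0.1, 0.2, 0.8, 0.9], [1, 1, 0, 0])  # labels flipped

print(performance_gap(src, tgt_close))  # → 0.0 (models transfer perfectly)
print(performance_gap(src, tgt_far))    # → 1.0 (models transfer maximally badly)
```

In the paper's framework this kind of discrepancy acts as an algorithm-dependent regularizer, and the boosting scheme (gapBoost) adjusts instance weights so that the weighted source and target samples keep the gap small.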
