
Easy Transfer Learning By Exploiting Intra-domain Structures

2019-04-02 · Code Available

Jindong Wang, Yiqiang Chen, Han Yu, Meiyu Huang, Qiang Yang



Abstract

Transfer learning aims at transferring knowledge from a well-labeled domain to a similar but different domain with limited or no labels. Unfortunately, existing learning-based methods often involve intensive model selection and hyperparameter tuning to obtain good results. Moreover, cross-validation is not possible for tuning hyperparameters since there are often no labels in the target domain. This restricts the wide applicability of transfer learning, especially on computationally constrained devices such as wearables. In this paper, we propose a practically Easy Transfer Learning (EasyTL) approach which requires no model selection and no hyperparameter tuning, while achieving competitive performance. By exploiting intra-domain structures, EasyTL is able to learn both non-parametric transfer features and classifiers. Extensive experiments demonstrate that, compared to state-of-the-art traditional and deep methods, EasyTL satisfies the Occam's Razor principle: it is extremely easy to implement and use while achieving comparable or better classification accuracy and much better computational efficiency. Additionally, it is shown that EasyTL can increase the performance of existing transfer feature learning methods.
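The abstract's recipe (non-parametric feature alignment plus a non-parametric classifier, no hyperparameters to tune) can be sketched roughly as follows. This is not the authors' implementation: it approximates the feature-learning step with CORAL-style covariance alignment and the classification step with a nearest-class-centroid rule; the function names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def coral_align(Xs, Xt, eps=1e-6):
    """Whiten the source covariance, then re-color with the target
    covariance (CORAL-style alignment); a stand-in for EasyTL's
    intra-domain feature step, not the paper's exact procedure."""
    def mat_pow(C, p):
        vals, vecs = np.linalg.eigh(C)
        vals = np.clip(vals, eps, None)
        return vecs @ np.diag(vals ** p) @ vecs.T
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(Xs.shape[1])
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(Xt.shape[1])
    return Xs @ mat_pow(Cs, -0.5) @ mat_pow(Ct, 0.5)

def easytl_like_predict(Xs, ys, Xt):
    """Align source features to the target, then label each target
    sample by its nearest source class centroid (non-parametric,
    nothing to tune)."""
    Xs_a = coral_align(Xs, Xt)
    classes = np.unique(ys)
    centroids = np.stack([Xs_a[ys == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(Xt[:, None, :] - centroids[None, :, :], axis=2)
    return classes[dists.argmin(axis=1)]

if __name__ == "__main__":
    # Toy two-class domains drawn from similar Gaussians (illustrative only).
    rng = np.random.default_rng(0)
    Xs = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(4, 0.5, (50, 2))])
    ys = np.array([0] * 50 + [1] * 50)
    Xt = np.vstack([rng.normal(0, 0.5, (40, 2)), rng.normal(4, 0.5, (40, 2))])
    print(easytl_like_predict(Xs, ys, Xt))
```

The appeal the abstract claims follows from this shape: both steps are closed-form, so there is no model selection and no labeled target data is needed.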

Tasks

Benchmark Results

Dataset       Model   Metric    Claimed  Verified  Status
ImageCLEF-DA  EasyTL  Accuracy  88.2     —         Unverified

Reproductions