Multi-Source Domain Adaptation with Mixture of Experts
EMNLP 2018 · 2018-09-07
Jiang Guo, Darsh J Shah, Regina Barzilay
- Code: github.com/jiangfeng1124/transfer (official, PyTorch)
Abstract
We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer.
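The combination step described in the abstract can be sketched as follows. This is a minimal illustrative stand-in, not the paper's implementation: it assumes a simple Euclidean point-to-set distance (the mean of a domain's encoded examples), whereas the paper learns the metric in an unsupervised fashion via meta-training. The function names and the softmax-over-negative-distance weighting are assumptions for illustration.

```python
import numpy as np

def point_to_set_distance(x, domain_examples):
    """Hypothetical point-to-set metric: Euclidean distance from a
    target example x to the mean of a source domain's examples.
    (The paper meta-learns this metric; this is a simple stand-in.)"""
    return np.linalg.norm(x - domain_examples.mean(axis=0))

def mixture_of_experts_predict(x, domains, experts):
    """Weight each source-domain expert by the target example's
    closeness to that domain, then mix their predictions."""
    dists = np.array([point_to_set_distance(x, d) for d in domains])
    # Softmax over negative distances: nearer domains get larger weight,
    # which is one way negative transfer from distant domains is suppressed.
    scores = np.exp(-dists)
    weights = scores / scores.sum()
    # Each expert returns a class-probability vector for x.
    preds = np.stack([expert(x) for expert in experts])  # shape (k, n_classes)
    return weights @ preds  # convex combination of expert predictions

# Usage with two toy source domains and fixed dummy experts:
rng = np.random.default_rng(0)
domains = [rng.normal(0.0, 1.0, size=(5, 3)),   # source domain 1
           rng.normal(4.0, 1.0, size=(5, 3))]   # source domain 2
experts = [lambda x: np.array([0.9, 0.1]),      # expert trained on domain 1
           lambda x: np.array([0.2, 0.8])]      # expert trained on domain 2
x = rng.normal(0.0, 1.0, size=3)                # target example near domain 1
p = mixture_of_experts_predict(x, domains, experts)
```

Because the weights form a convex combination and each expert outputs a probability distribution, the mixed prediction is itself a valid distribution over classes.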