
MATE: Plugging in Model Awareness to Task Embedding for Meta Learning

2020-12-01 · NeurIPS 2020 · Code Available

Xiaohan Chen, Zhangyang Wang, Siyu Tang, Krikamol Muandet


Abstract

Meta-learning improves the generalization of machine learning models on previously unseen tasks by leveraging experience from different yet related prior tasks. To allow for better generalization, we propose a novel task representation called model-aware task embedding (MATE) that incorporates not only the data distributions of different tasks, but also the complexity of the tasks through the models used. The task complexity is taken into account by a novel variant of kernel mean embedding, combined with an instance-adaptive attention mechanism inspired by an SVM-based feature selection algorithm. Together with conditioning layers in deep neural networks, MATE can be easily incorporated into existing meta-learners as a plug-and-play module. While MATE is widely applicable to general tasks where the concept of task/environment is involved, we demonstrate its effectiveness in few-shot learning by consistently improving a state-of-the-art model on two benchmarks. Source code for this paper is available at https://github.com/VITA-Group/MATE.
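The core idea in the abstract, a kernel mean embedding of a task's data reweighted by per-instance importance scores, can be illustrated with a short sketch. This is not the authors' implementation: the feature map here is a generic random-Fourier-feature approximation of an RBF kernel, and the `scores` argument is a hypothetical stand-in for the paper's SVM-inspired, model-aware attention module, collapsed to a plain softmax for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_feats=128, gamma=1.0):
    """Random Fourier features approximating the feature map phi of an
    RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_feats))
    b = rng.uniform(0.0, 2 * np.pi, size=n_feats)
    return np.sqrt(2.0 / n_feats) * np.cos(X @ W + b)

def task_embedding(phi, scores=None):
    """Weighted empirical kernel mean embedding sum_i w_i * phi(x_i).

    phi:    (n, D) feature matrix for a task's support set.
    scores: hypothetical per-instance importances (e.g. produced by a
            model-aware attention module); None recovers the plain,
            uniformly weighted kernel mean embedding.
    """
    n = phi.shape[0]
    if scores is None:
        w = np.full(n, 1.0 / n)
    else:
        e = np.exp(scores - scores.max())  # softmax over instances
        w = e / e.sum()
    return w @ phi

# Toy support set for one task: 20 instances, 5 features each.
X = rng.normal(size=(20, 5))
phi = rff_features(X)
uniform = task_embedding(phi)                          # data-only embedding
attended = task_embedding(phi, rng.normal(size=20))    # model-aware variant
print(uniform.shape, attended.shape)                   # (128,) (128,)
```

The resulting fixed-length vector summarizes the task and could condition downstream network layers; in MATE itself the weights come from the model's behavior on each instance rather than from random scores.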
