Collaborative Multi-Task Representation for Natural Language Understanding
Anonymous
Abstract
Multi-task learning has brought substantial benefits to natural language understanding tasks. However, the seesaw phenomenon, in which improving one task degrades the performance of others, remains prominent in existing multi-task learning solutions: it is difficult to balance the importance of different tasks while learning a unified representation of natural language. In this paper, we propose a Collaborative Multi-Task Representation (CMTR) framework to tackle this problem. Specifically, we capture instance-level task relations with a task interaction layer and compute the final representation of each instance as a mixture of task-oriented representations. We also introduce auxiliary losses to ease optimization and improve the generalization ability of the multi-task representations. Empirically, CMTR outperforms state-of-the-art multi-task learning frameworks on natural language understanding tasks, and extensive analyses further reveal its effectiveness.
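The mixing step described in the abstract could be sketched as attention over per-task representations. The sketch below is only an illustration of that idea under assumed details: the dot-product scoring, the function names (`mix_task_representations`, `softmax`), and the use of an instance-level query vector are assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mix_task_representations(task_reps, query):
    """Mix task-oriented representations for a single instance.

    task_reps: (num_tasks, dim) array, one representation per task.
    query:     (dim,) instance-level query vector (assumed here as the
               source of instance-level task relations).
    Returns (mixed representation of shape (dim,), task weights).
    """
    # Scaled dot-product scores between the instance and each task.
    scores = task_reps @ query / np.sqrt(task_reps.shape[-1])
    weights = softmax(scores)        # instance-level task relations
    mixed = weights @ task_reps      # mixture of task-oriented reps
    return mixed, weights

# Toy usage: 3 tasks, 8-dimensional representations.
rng = np.random.default_rng(0)
task_reps = rng.normal(size=(3, 8))
query = rng.normal(size=8)
mixed, weights = mix_task_representations(task_reps, query)
```

Because the weights form a convex combination, every instance still receives a single unified representation, while the per-instance weights let different instances lean on different tasks.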