Self-training improves Recurrent Neural Networks performance for Temporal Relation Extraction
2018-10-01 · WS 2018
Chen Lin, Timothy Miller, Dmitriy Dligach, Hadi Amiri, Steven Bethard, Guergana Savova
Abstract
Neural network models are often restricted by limited labeled instances and resort to advanced architectures and features for cutting-edge performance. We propose to build a recurrent neural network with multiple semantically heterogeneous embeddings within a self-training framework. Our framework makes use of labeled, unlabeled, and social media data, operates on basic features, and is scalable and generalizable. With this method, we establish state-of-the-art results in both in-domain and cross-domain settings for a clinical temporal relation extraction task.
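The abstract names the approach but does not detail the training procedure. Below is a minimal sketch of a self-training loop with an RNN over multiple concatenated embeddings, assuming a GRU classifier, two embedding tables standing in for the semantically heterogeneous embeddings, and a fixed confidence threshold for pseudo-labeling. All class names, hyperparameters, and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Sketch of self-training for an RNN relation classifier (illustrative only).
import torch
import torch.nn as nn

class MultiEmbeddingRNN(nn.Module):
    """GRU classifier over two concatenated embedding spaces
    (e.g., word embeddings plus a second, semantically distinct space)."""
    def __init__(self, vocab=100, dim1=32, dim2=16, hidden=64, n_classes=3):
        super().__init__()
        self.emb1 = nn.Embedding(vocab, dim1)
        self.emb2 = nn.Embedding(vocab, dim2)
        self.rnn = nn.GRU(dim1 + dim2, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, x):
        e = torch.cat([self.emb1(x), self.emb2(x)], dim=-1)
        _, h = self.rnn(e)                    # h: (1, batch, hidden)
        return self.out(h.squeeze(0))         # logits: (batch, n_classes)

def self_train(model, labeled, unlabeled, rounds=3, threshold=0.9, epochs=5):
    """Iteratively retrain, then promote confident predictions on
    unlabeled data into the labeled pool (the self-training step)."""
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(rounds):
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        for _ in range(epochs):               # retrain on current labeled pool
            x, y = labeled
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
        if unlabeled is None or len(unlabeled) == 0:
            break
        with torch.no_grad():                 # pseudo-label unlabeled instances
            probs = torch.softmax(model(unlabeled), dim=-1)
            conf, pseudo = probs.max(dim=-1)
            keep = conf >= threshold
        if keep.any():                        # grow labeled pool, shrink unlabeled
            x, y = labeled
            labeled = (torch.cat([x, unlabeled[keep]]),
                       torch.cat([y, pseudo[keep]]))
            unlabeled = unlabeled[~keep]
    return model

# Synthetic stand-ins for labeled and unlabeled token-ID sequences.
labeled = (torch.randint(0, 100, (20, 12)), torch.randint(0, 3, (20,)))
unlabeled = torch.randint(0, 100, (50, 12))
self_train(MultiEmbeddingRNN(), labeled, unlabeled)
```

The confidence threshold controls the usual self-training trade-off: a high threshold adds fewer but cleaner pseudo-labels per round, while a low one grows the training pool faster at the risk of reinforcing the model's own errors.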