Hybrid Neural Models For Sequence Modelling: The Best Of Three Worlds
2019-09-16
Marco Dinarelli, Loïc Grobol
Abstract
We propose a neural architecture that combines the main characteristics of the most successful neural models of recent years: bidirectional RNNs, encoder-decoder architectures, and the Transformer. Evaluation on three sequence labelling tasks yields results close to the state of the art on all tasks, and better than it on some, showing the pertinence of this hybrid architecture for this kind of task.
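To make the three combined ingredients concrete, here is a minimal numpy sketch of such a hybrid pipeline for sequence labelling: a bidirectional RNN encoder, a Transformer-style self-attention layer over the encoder states, and a per-position output projection standing in for the decoder. All dimensions, weights, and function names are hypothetical illustrations, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_pass(x, Wx, Wh, reverse=False):
    # simple tanh RNN over the time dimension; x has shape (T, d_in)
    T = x.shape[0]
    h = np.zeros(Wh.shape[0])
    out = np.zeros((T, Wh.shape[0]))
    steps = range(T - 1, -1, -1) if reverse else range(T)
    for t in steps:
        h = np.tanh(x[t] @ Wx + h @ Wh)
        out[t] = h
    return out

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H):
    # single-head scaled dot-product self-attention (Transformer-style),
    # with queries, keys, and values all set to H for brevity
    d = H.shape[1]
    scores = H @ H.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ H

# toy dimensions (hypothetical): sequence length, input, hidden, label set
T, d_in, d_h, n_labels = 5, 8, 16, 4
x = rng.standard_normal((T, d_in))

# bidirectional RNN encoder: concatenate forward and backward states
Wx_f, Wh_f = rng.standard_normal((d_in, d_h)), 0.1 * rng.standard_normal((d_h, d_h))
Wx_b, Wh_b = rng.standard_normal((d_in, d_h)), 0.1 * rng.standard_normal((d_h, d_h))
H = np.concatenate(
    [rnn_pass(x, Wx_f, Wh_f), rnn_pass(x, Wx_b, Wh_b, reverse=True)], axis=1
)

# Transformer-style self-attention over the encoder states
A = self_attention(H)

# decoder stand-in: project each position to label scores and pick the argmax
W_out = rng.standard_normal((2 * d_h, n_labels))
labels = softmax(A @ W_out).argmax(axis=-1)
print(labels.shape)  # one predicted label per input position
```

In a real sequence labeller the output layer would typically also condition on previously predicted labels (the encoder-decoder aspect); here that is reduced to an independent per-position projection to keep the sketch short.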