
Experiments with LVT and FRE for Transformer model

2020-04-26

Ilshat Gibadullin, Aidar Valeev


Abstract

In this paper, we experiment with the Large Vocabulary Trick (LVT) and feature-rich encoding (FRE) applied to the Transformer model for text summarization. We could not achieve better results than the analogous RNN-based sequence-to-sequence model, so we experimented with further model variants to find out what improves the results and what deteriorates them.
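The Large Vocabulary Trick mentioned in the abstract restricts the decoder's output softmax, per batch, to the tokens occurring in that batch's source texts plus the most frequent tokens of the full vocabulary, which shrinks the softmax and biases generation toward copying source words. A minimal sketch of that idea (function names and shapes are illustrative, not the authors' implementation):

```python
import numpy as np

def lvt_vocab(batch_sources, topk_ids):
    """Per-batch LVT vocabulary: union of all token ids appearing in the
    batch's source texts and the top-K most frequent ids overall."""
    ids = set(topk_ids)
    for src in batch_sources:
        ids.update(src)
    return sorted(ids)

def restricted_softmax(logits, batch_vocab):
    """Softmax computed only over the restricted vocabulary; every token
    outside it receives probability exactly 0.

    logits: 1-D array over the full vocabulary.
    """
    sub = logits[batch_vocab]
    sub = np.exp(sub - sub.max())          # numerically stable softmax
    probs = np.zeros_like(logits)
    probs[batch_vocab] = sub / sub.sum()   # renormalize over the subset
    return probs

# Toy batch: two sources, top-K = {0, 1}, full vocabulary of 10 tokens.
bv = lvt_vocab([[1, 5], [2]], topk_ids=[0, 1])   # → [0, 1, 2, 5]
p = restricted_softmax(np.zeros(10), bv)
```

With uniform logits, the four in-vocabulary tokens each get probability 0.25 and the other six get 0, so training and decoding never spend capacity on tokens the batch cannot reference.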
