
KIT's IWSLT 2020 SLT Translation System

2020-07-01 · WS 2020

Ngoc-Quan Pham, Felix Schneider, Tuan-Nam Nguyen, Thanh-Le Ha, Thai Son Nguyen, Maximilian Awiszus, Sebastian Stüker, Alexander Waibel

Abstract

This paper describes KIT's submissions to the IWSLT 2020 Speech Translation evaluation campaign. We first participate in the simultaneous translation task, in which our simultaneous models are Transformer-based and can be trained efficiently to achieve low latency with minimal compromise in quality. On the offline speech translation task, we applied our new Speech Transformer architecture to end-to-end speech translation. The resulting model provides translation quality competitive with a complicated cascade. The latter still has the upper hand, thanks to its ability to transparently access the transcription and to resegment the inputs to avoid fragmentation.
