SOTAVerified

NMT

Neural machine translation is an approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modeling entire sentences in a single integrated model.
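The "likelihood of a sequence of words" is typically computed autoregressively: the model scores a target sentence as the product of per-token conditional probabilities given the source sentence and the target prefix. Below is a minimal sketch of that scoring scheme; `next_token_probs` is a hypothetical stand-in that returns a fixed uniform distribution, where a real NMT system would use a neural network (e.g. an encoder-decoder) to produce the softmax output.

```python
import math

# Hypothetical toy "model" for illustration only: returns a uniform
# distribution over a tiny vocabulary. A real NMT model would compute
# these conditional probabilities with a neural network conditioned on
# the source sentence and the target prefix.
def next_token_probs(source, prefix):
    vocab = ["<eos>", "la", "maison", "bleue"]
    return {tok: 1.0 / len(vocab) for tok in vocab}

def sentence_log_prob(source, target):
    """log P(target | source) = sum over t of log P(y_t | y_<t, source)."""
    log_p = 0.0
    for t, tok in enumerate(target):
        probs = next_token_probs(source, target[:t])
        log_p += math.log(probs[tok])
    return log_p

score = sentence_log_prob(["the", "blue", "house"],
                          ["la", "maison", "bleue", "<eos>"])
print(score)  # 4 tokens, uniform over 4 symbols -> 4 * log(1/4) = -5.545...
```

The key point the sketch illustrates is the "single integrated model" property: one function scores the whole sentence end to end, rather than combining separately trained translation, reordering, and language-model components as in phrase-based systems.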

Papers

Showing 1151–1175 of 1773 papers

Title | Hype
Learning to Discriminate Noises for Incorporating External Information in Neural Machine Translation | 0
Learning to Generate Word- and Phrase-Embeddings for Efficient Phrase-Based Neural Machine Translation | 0
Learning to Multi-Task Learn for Better Neural Machine Translation | 0
Learning to Refine Source Representations for Neural Machine Translation | 0
Learning to Reuse Translations: Guiding Neural Machine Translation with Examples | 0
Learning to Segment Inputs for NMT Favors Character-Level Processing | 0
Leveraging Diverse Modeling Contexts with Collaborating Learning for Neural Machine Translation | 0
Leveraging GPT-4 for Automatic Translation Post-Editing | 0
Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation | 0
Lexical Micro-adaptation for Neural Machine Translation | 0
Lexicons and Minimum Risk Training for Neural Machine Translation: NAIST-CMU at WAT2016 | 0
Linguistically Informed Hindi-English Neural Machine Translation | 0
Linguistically Motivated Subwords for English-Tamil Translation: University of Groningen’s Submission to WMT-2020 | 0
Linguistically Motivated Vocabulary Reduction for Neural Machine Translation from Turkish to English | 0
Linguistically-Motivated Yorùbá-English Machine Translation | 0
Literary Machine Translation under the Magnifying Glass: Assessing the Quality of an NMT-Translated Detective Novel on Document Level | 0
LIUM's Contributions to the WMT2019 News Translation Task: Data and Systems for German-French Language Pairs | 0
Locality-Sensitive Hashing for Long Context Neural Machine Translation | 0
Long-Short Term Masking Transformer: A Simple but Effective Baseline for Document-level Neural Machine Translation | 0
Long Warm-up and Self-Training: Training Strategies of NICT-2 NMT System at WAT-2019 | 0
Look-ahead Attention for Generation in Neural Machine Translation | 0
Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | 0
Look Harder: A Neural Machine Translation Model with Hard Attention | 0
Look It Up: Bilingual Dictionaries Improve Neural Machine Translation | 0
Low-Latency Neural Speech Translation | 0
Page 47 of 71

No leaderboard results yet.