SOTAVerified

NMT

Neural machine translation (NMT) is an approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modeling an entire sentence in a single integrated model.
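The core idea — scoring a whole target sentence by its likelihood — can be sketched with the chain rule of probability. The snippet below is a minimal illustration only: the `TOY_PROBS` table is a hypothetical stand-in for a trained network (a real NMT decoder would compute these conditional probabilities with an encoder-decoder conditioned on the source sentence).

```python
import math

# Hypothetical conditional probabilities, standing in for a trained
# neural decoder. Keys are the target-side history so far; values map
# each possible next token to P(token | history).
TOY_PROBS = {
    ("<s>",): {"the": 0.6, "a": 0.4},
    ("<s>", "the"): {"cat": 0.5, "dog": 0.5},
    ("<s>", "the", "cat"): {"</s>": 1.0},
}

def sentence_log_likelihood(tokens, cond_probs):
    """Chain rule: log P(w_1..w_n) = sum_i log P(w_i | w_<i)."""
    history = ("<s>",)
    total = 0.0
    for tok in tokens + ["</s>"]:  # score the end-of-sentence token too
        total += math.log(cond_probs[history][tok])
        history = history + (tok,)
    return total

ll = sentence_log_likelihood(["the", "cat"], TOY_PROBS)
# log(0.6) + log(0.5) + log(1.0) = log(0.3)
```

Decoding in an NMT system then amounts to searching (e.g. with beam search) for the target sequence that maximizes this score.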

Papers

Showing 1151–1200 of 1773 papers

Learning to Discriminate Noises for Incorporating External Information in Neural Machine Translation
Learning to Generate Word- and Phrase-Embeddings for Efficient Phrase-Based Neural Machine Translation
Learning to Multi-Task Learn for Better Neural Machine Translation
Learning to Refine Source Representations for Neural Machine Translation
Learning to Reuse Translations: Guiding Neural Machine Translation with Examples
Learning to Segment Inputs for NMT Favors Character-Level Processing
Leveraging Diverse Modeling Contexts with Collaborating Learning for Neural Machine Translation
Leveraging GPT-4 for Automatic Translation Post-Editing
Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation
Lexical Micro-adaptation for Neural Machine Translation
Lexicons and Minimum Risk Training for Neural Machine Translation: NAIST-CMU at WAT2016
Linguistically Informed Hindi-English Neural Machine Translation
Linguistically Motivated Subwords for English-Tamil Translation: University of Groningen’s Submission to WMT-2020
Linguistically Motivated Vocabulary Reduction for Neural Machine Translation from Turkish to English
Linguistically-Motivated Yorùbá-English Machine Translation
Literary Machine Translation under the Magnifying Glass: Assessing the Quality of an NMT-Translated Detective Novel on Document Level
LIUM's Contributions to the WMT2019 News Translation Task: Data and Systems for German-French Language Pairs
Locality-Sensitive Hashing for Long Context Neural Machine Translation
Long-Short Term Masking Transformer: A Simple but Effective Baseline for Document-level Neural Machine Translation
Long Warm-up and Self-Training: Training Strategies of NICT-2 NMT System at WAT-2019
Look-ahead Attention for Generation in Neural Machine Translation
Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation
Look Harder: A Neural Machine Translation Model with Hard Attention
Look It Up: Bilingual Dictionaries Improve Neural Machine Translation
Low-Latency Neural Speech Translation
Low-Resource Machine Translation Training Curriculum Fit for Low-Resource Languages
Using Interlinear Glosses as Pivot in Low-Resource Multilingual Machine Translation
Low Resource Multimodal Neural Machine Translation of English-Hindi in News Domain
Low-resource Neural Machine Translation: Benchmarking State-of-the-art Transformer for Wolof<->French
LTRC-MT Simple & Effective Hindi-English Neural Machine Translation Systems at WAT 2019
Machine-oriented NMT Adaptation for Zero-shot NLP tasks: Comparing the Usefulness of Close and Distant Languages
Machine Translation at Booking.com: Journey and Lessons Learned
Machine Translation by Projecting Text into the Same Phonetic-Orthographic Space Using a Common Encoding
Machine Translationese: Effects of Algorithmic Bias on Linguistic Complexity in Machine Translation
Machine Translation for English–Inuktitut with Segmentation, Data Acquisition and Pre-Training
Machine Translation for Livonian: Catering to 20 Speakers
Machine Translation for Machines: the Sentiment Classification Use Case
Machine Translation of 16th Century Letters from Latin to German
Machine Translation Quality: A comparative evaluation of SMT, NMT and tailored-NMT outputs
MacNet: Transferring Knowledge from Machine Comprehension to Sequence-to-Sequence Models
Make the Blind Translator See The World: A Novel Transfer Learning Solution for Multimodal Machine Translation
Massively Multilingual Neural Machine Translation
Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges
Master Thesis: Neural Sign Language Translation by Learning Tokenization
KÚ <MASK>: Integrating Yorùbá cultural greetings into machine translation
MBR and QE Finetuning: Training-time Distillation of the Best and Most Expensive Decoding Methods
Meaningless yet meaningful: Morphology grounded subword-level NMT
Measuring and Improving Faithfulness of Attention in Neural Machine Translation
Measuring and Mitigating Name Biases in Neural Machine Translation
Meet Changes with Constancy: Learning Invariance in Multi-Source Translation
Page 24 of 36

No leaderboard results yet.