SOTAVerified

NMT

Neural machine translation is an approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modeling entire sentences in a single integrated model.
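The "likelihood of a sequence of words" above is the usual chain-rule factorization P(y | x) = ∏ₜ P(yₜ | y₍<t₎, x). A minimal, purely illustrative sketch of that scoring step follows; `toy_next_token_probs` is a hypothetical stand-in for a trained neural decoder, not a real model.

```python
import math

def toy_next_token_probs(source_tokens, prefix):
    """Fake conditional distribution over the next target token.

    A real NMT system would run encoder/decoder networks here; this toy
    just favors one fixed "translation" of the source, token by token.
    """
    vocab = ["<eos>", "hallo", "welt", "hello", "world"]
    reference = {"hello world": ["hallo", "welt", "<eos>"]}[" ".join(source_tokens)]
    target = reference[len(prefix)] if len(prefix) < len(reference) else "<eos>"
    return {w: (0.9 if w == target else 0.1 / (len(vocab) - 1)) for w in vocab}

def sequence_log_prob(source_tokens, target_tokens):
    """log P(y | x) = sum over t of log P(y_t | y_<t, x)."""
    logp = 0.0
    prefix = []
    for tok in target_tokens + ["<eos>"]:
        probs = toy_next_token_probs(source_tokens, prefix)
        logp += math.log(probs[tok])
        prefix.append(tok)
    return logp

good = sequence_log_prob(["hello", "world"], ["hallo", "welt"])
bad = sequence_log_prob(["hello", "world"], ["welt", "hallo"])
print(good > bad)  # the matching translation scores higher
```

The point of the sketch is only the factorization: the whole sentence is scored by one model, one target token at a time, each conditioned on the full source and the target prefix so far.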

Papers

Showing 901–950 of 1773 papers

Title (every paper listed below currently has a Hype score of 0 and no verification status)
Using Context in Neural Machine Translation Training Objectives
Using Images to Improve Machine-Translating E-Commerce Product Listings.
Investigating Massive Multilingual Pre-Trained Machine Translation Models for Clinical Domain via Transfer Learning
Enhanced back-translation for low resource neural machine translation using self-training
Using Semantic Role Labeling to Improve Neural Machine Translation
Using Semantic Similarity as Reward for Reinforcement Learning in Sentence Generation
Using Spoken Word Posterior Features in Neural Machine Translation
Using Target-side Monolingual Data for Neural Machine Translation through Multi-task Learning
Utilizing Monolingual Data in NMT for Similar Languages: Submission to Similar Language Translation Task
Validation of Tsallis Entropy In Inter-Modality Neuroimage Registration
Variational Neural Machine Translation with Normalizing Flows
Variational Recurrent Neural Machine Translation
Vector-Vector-Matrix Architecture: A Novel Hardware-Aware Framework for Low-Latency Inference in NLP Applications
Visualizing and Understanding Neural Machine Translation
Vocabulary Adaptation for Distant Domain Adaptation in Neural Machine Translation
What Level of Quality can Neural Machine Translation Attain on Literary Text?
What Makes Word-level Neural Machine Translation Hard: A Case Study on English-German Translation
What Role Does BERT Play in the Neural Machine Translation Encoder?
What Works and Doesn't Work, A Deep Decoder for Neural Machine Translation
When and Why is Unsupervised Neural Machine Translation Useless?
When a 'sport' is a person and other issues for NMT of novels
When do Contrastive Word Alignments Improve Many-to-many Neural Machine Translation?
When is Char Better Than Subword: A Systematic Study of Segmentation Algorithms for Neural Machine Translation
Who Evaluates the Evaluators? On Automatic Metrics for Assessing AI-based Offensive Code Generators
Why Find the Right One?
Why Neural Machine Translation Prefers Empty Outputs
Word Alignment in the Era of Deep Learning: A Tutorial
Word Rewarding for Adequate Neural Machine Translation
WT: Wipro AI Submissions to the WAT 2020
XMU Neural Machine Translation Online Service
xSIM++: An Improved Proxy to Bitext Mining Performance for Low-Resource Languages
YANMTT: Yet Another Neural Machine Translation Toolkit
Zero-Resource Neural Machine Translation with Multi-Agent Communication Game
Zero-Resource Neural Machine Translation with Monolingual Pivot Data
Zero-Shot Cross-lingual Classification Using Multilingual Neural Machine Translation
Zero-Shot Neural Machine Translation: Russian-Hindi @LoResMT 2020
Zero-Shot Neural Machine Translation with Self-Learning Cycle
Zero-shot translation among Indian languages
Zero-Shot Translation using Diffusion Models
Neural Machine Translation with Recurrent Highway Networks
FFR v1.1: Fon-French Neural Machine Translation
Filtering Back-Translated Data in Unsupervised Neural Machine Translation
Finding Sami Cognates with a Character-Based NMT Approach
Findings of the Fourth Workshop on Neural Generation and Translation
Findings of the WMT 2018 Shared Task on Automatic Post-Editing
Findings of the WMT 2020 Shared Task on Automatic Post-Editing
Finding the Right Recipe for Low Resource Domain Adaptation in Neural Machine Translation
Fine-Grained Attention Mechanism for Neural Machine Translation
Page 19 of 36

No leaderboard results yet.