
NMT

Neural machine translation is an approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modeling entire sentences in a single integrated model.
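As a toy illustration of the "likelihood of a sequence of words" idea (not the implementation of any system listed below), an NMT decoder scores a target sentence by chaining per-word conditional probabilities, p(y | x) = ∏ₜ p(yₜ | y₍<t₎, x). The per-step probabilities here are hard-coded stand-ins for what a trained network would output:

```python
import math

def sequence_log_likelihood(step_probs):
    """Sum log p(y_t | y_<t, x) over the target sentence.

    Log-space summation avoids numerical underflow that the raw
    product of many small probabilities would cause.
    """
    return sum(math.log(p) for p in step_probs)

# Hypothetical decoder outputs for a three-word translation.
step_probs = [0.9, 0.8, 0.7]
log_lik = sequence_log_likelihood(step_probs)

# exp of the summed logs recovers the product 0.9 * 0.8 * 0.7.
print(round(math.exp(log_lik), 3))  # -> 0.504
```

In a real system the conditionals would come from the decoder's softmax over the vocabulary at each step, conditioned on the source sentence and the previously generated words.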

Papers

Showing 1451–1500 of 1773 papers

Title | Status | Hype
Correcting Length Bias in Neural Machine Translation | | 0
Revisiting Character-Based Neural Machine Translation with Capacity and Compression | | 0
An Operation Sequence Model for Explainable Neural Machine Translation | Code | 0
A Tree-based Decoder for Neural Machine Translation | Code | 0
A Study of Reinforcement Learning for Neural Machine Translation | Code | 0
Contextual Parameter Generation for Universal Neural Machine Translation | Code | 0
Meta-Learning for Low-Resource Neural Machine Translation | | 0
Exploring Recombination for Efficient Decoding of Neural Machine Translation | Code | 0
Paraphrases as Foreign Languages in Multilingual Neural Machine Translation | | 0
Style Transfer as Unsupervised Machine Translation | | 0
SwitchOut: an Efficient Data Augmentation Algorithm for Neural Machine Translation | | 0
Learning When to Concentrate or Divert Attention: Self-Adaptive Attention Temperature for Neural Machine Translation | Code | 0
Training Deeper Neural Machine Translation Models with Transparent Attention | Code | 0
Measuring Semantic Abstraction of Multilingual NMT with Paraphrase Recognition and Generation Tasks | | 0
SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing | | 0
Neural Machine Translation of Text from Non-Native Speakers | Code | 0
Regularizing Neural Machine Translation by Target-bidirectional Agreement | | 0
D-PAGE: Diverse Paraphrase Generation | | 0
Ancient-Modern Chinese Translation with a Large Training Dataset | | 0
Debugging Neural Machine Translations | Code | 0
code2seq: Generating Sequences from Structured Representations of Code | Code | 0
Incorporating Syntactic Uncertainty in Neural Machine Translation with a Forest-to-Sequence Model | | 0
Neural Machine Translation Incorporating Named Entity | | 0
KIT Lecture Translator: Multilingual Speech Translation with One-Shot Learning | | 0
Low-Latency Neural Speech Translation | | 0
Adaptive Weighting for Neural Machine Translation | Code | 0
Neural Machine Translation with Decoding History Enhanced Attention | | 0
Improving Neural Machine Translation by Incorporating Hierarchical Subword Features | | 0
Tailoring Neural Architectures for Translating from Morphologically Rich Languages | | 0
Effective Parallel Corpus Mining using Bilingual Sentence Embeddings | | 0
Training Neural Machine Translation using Word Embedding-based Loss | | 0
NMT-based Cross-lingual Document Embeddings | | 0
Finding Better Subword Segmentation for Neural Machine Translation | Code | 0
Otem&Utem: Over- and Under-Translation Evaluation Metric for NMT | Code | 0
Recurrent Stacking of Layers for Compact Neural Machine Translation Models | | 0
NMT-Keras: a Very Flexible Toolkit with a Focus on Interactive NMT and Online Learning | | 0
Testing Untestable Neural Machine Translation: An Industrial Case | | 0
Regularized Training Objective for Continued Training for Domain Adaptation in Neural Machine Translation | Code | 0
Moon IME: Neural-based Chinese Pinyin Aided Input Method with Customizable Association | | 0
Learning Distributional Token Representations from Visual Features | | 0
Named-Entity Tagging and Domain adaptation for Better Customized Translation | | 0
How Much Attention Do You Need? A Granular Analysis of Neural Machine Translation Architectures | | 0
Neural Hidden Markov Model for Machine Translation | | 0
SuperNMT: Neural Machine Translation with Semantic Supersenses and Syntactic Supertags | | 0
Forest-Based Neural Machine Translation | | 0
Neural Machine Translation Techniques for Named Entity Transliteration | Code | 0
NICT Self-Training Approach to Neural Machine Translation at NMT-2018 | | 0
Unsupervised Source Hierarchies for Low-Resource Neural Machine Translation | | 0
A Simple and Effective Approach to Coverage-Aware Neural Machine Translation | | 0
Are BLEU and Meaning Representation in Opposition? | | 0
Page 30 of 36

No leaderboard results yet.