SOTAVerified

Machine Translation

Machine translation is the task of automatically translating text from a source language into a different target language.

Approaches to machine translation range from rule-based to statistical to neural. More recently, attention-based encoder-decoder architectures such as the Transformer have driven major improvements in machine translation quality.

One of the most popular benchmark suites for machine translation systems is the WMT family of datasets. Commonly used evaluation metrics include BLEU, METEOR, and NIST.
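To make the most common of these metrics concrete, below is a minimal, stdlib-only sketch of sentence-level BLEU with uniform n-gram weights, a single reference, and the standard brevity penalty. The function names and simplifications are ours, not a standard API; real evaluations should use an established implementation such as sacreBLEU, which also fixes tokenization.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count all n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(reference, candidate, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of clipped n-gram
    precisions (n = 1..max_n) times a brevity penalty. Single reference,
    no smoothing, so any zero n-gram overlap yields a score of 0."""
    ref, cand = reference.split(), candidate.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams = ngrams(cand, n)
        ref_ngrams = ngrams(ref, n)
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
        total = max(sum(cand_ngrams.values()), 1)
        if overlap == 0:
            return 0.0
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty: punish candidates shorter than the reference.
    if len(cand) > len(ref):
        bp = 1.0
    else:
        bp = math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(log_precisions) / max_n)

print(bleu("the cat sat on the mat", "the cat sat on the mat"))  # 1.0 for an exact match
```

Because short sentences often have no 4-gram overlap, production implementations add smoothing; corpus-level BLEU also aggregates counts over all sentences before taking the geometric mean.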

(Image credit: Google seq2seq)

Papers

Showing 6001–6050 of 10752 papers

Title | Status | Hype
The Karlsruhe Institute of Technology Systems for the News Translation Task in WMT 2018 | | 0
The JHU Parallel Corpus Filtering Systems for WMT 2018 | | 0
The JHU Machine Translation Systems for WMT 2018 | | 0
The LMU Munich Unsupervised Machine Translation Systems | | 0
Towards Less Generic Responses in Neural Conversation Models: A Statistical Re-weighting Method | Code | 0
The MLLP-UPV German-English Machine Translation System for WMT18 | | 0
Tilde's Parallel Corpus Filtering Methods for WMT 2018 | | 0
The IWSLT 2018 Evaluation Campaign | | 0
The University of Edinburgh's Submissions to the WMT18 News Translation Task | | 0
The University of Helsinki submissions to the WMT18 news task | | 0
Turku Neural Parser Pipeline: An End-to-End System for the CoNLL 2018 Shared Task | | 0
The University of Maryland's Chinese-English Neural Machine Translation Systems at WMT18 | | 0
The Word Sense Disambiguation Test Suite at WMT18 | | 0
The WMT'18 Morpheval test suites for English-Czech, English-German, English-Finnish and Turkish-English | | 0
Translating a Math Word Problem to an Expression Tree | | 0
UTFPR at WMT 2018: Minimalistic Supervised Corpora Filtering for Machine Translation | | 0
The ILSP/ARC submission to the WMT 2018 Parallel Corpus Filtering Shared Task | | 0
Using Spoken Word Posterior Features in Neural Machine Translation | | 0
Three Strategies to Improve One-to-Many Multilingual Translation | | 0
The RWTH Aachen University Supervised Machine Translation Systems for WMT 2018 | | 0
Phrase-Based Attentions | | 0
Discrete Structural Planning for Generating Diverse Translations | | 0
Differentiable Expected BLEU for Text Generation | | 0
Connecting the Dots Between MLE and RL for Sequence Generation | | 0
AutoLoss: Learning Discrete Schedule for Alternate Optimization | | 0
Hallucinations in Neural Machine Translation | | 0
Assumption Questioning: Latent Copying and Reward Exploitation in Question Generation | | 0
GraphSeq2Seq: Graph-Sequence-to-Sequence for Neural Machine Translation | | 0
Wronging a Right: Generating Better Errors to Improve Grammatical Error Detection | Code | 0
Fast and Simple Mixture of Softmaxes with BPE and Hybrid-LightRNN for Language Generation | Code | 0
Predicting protein secondary structure with Neural Machine Translation | | 0
Semi-Supervised Sequence Modeling with Cross-View Training | Code | 0
Attention-based Encoder-Decoder Networks for Spelling and Grammatical Error Correction | | 0
NICT's Corpus Filtering Systems for the WMT18 Parallel Corpus Filtering Task | | 0
NICT's Neural and Statistical Machine Translation Systems for the WMT18 News Translation Task | | 0
FRAGE: Frequency-Agnostic Word Representation | Code | 0
Multi-task Learning with Sample Re-weighting for Machine Reading Comprehension | Code | 0
Quantum Statistics-Inspired Neural Attention | | 0
Comparison of Deep Learning and the Classical Machine Learning Algorithm for the Malware Detection | | 0
Freezing Subnetworks to Analyze Domain Adaptation in Neural Machine Translation | | 0
XNLI: Evaluating Cross-lingual Sentence Representations | Code | 0
Zero-Shot Cross-lingual Classification Using Multilingual Neural Machine Translation | | 0
On The Alignment Problem In Multi-Head Attention-Based Neural Machine Translation | | 0
Greedy Search with Probabilistic N-gram Matching for Neural Machine Translation | Code | 0
Multilingual Extractive Reading Comprehension by Runtime Machine Translation | Code | 0
Towards one-shot learning for rare-word translation with external experts | | 0
Speeding Up Neural Machine Translation Decoding by Cube Pruning | | 0
An Empirical Investigation into Learning Bug-Fixing Patches in the Wild via Neural Machine Translation | | 0
Cell-aware Stacked LSTMs for Modeling Sentences | | 0
Logographic Subword Model for Neural Machine Translation | | 0
Page 121 of 216

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | Transformer Cycle (Rev) | BLEU score | 35.14 | | Unverified
2 | Noisy back-translation | BLEU score | 35 | | Unverified
3 | Transformer+Rep(Uni) | BLEU score | 33.89 | | Unverified
4 | T5-11B | BLEU score | 32.1 | | Unverified
5 | BiBERT | BLEU score | 31.26 | | Unverified
6 | Transformer + R-Drop | BLEU score | 30.91 | | Unverified
7 | Bi-SimCut | BLEU score | 30.78 | | Unverified
8 | BERT-fused NMT | BLEU score | 30.75 | | Unverified
9 | Data Diversification - Transformer | BLEU score | 30.7 | | Unverified
10 | SimCut | BLEU score | 30.56 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Transformer+BT (ADMIN init) | BLEU score | 46.4 | | Unverified
2 | Noisy back-translation | BLEU score | 45.6 | | Unverified
3 | mRASP+Fine-Tune | BLEU score | 44.3 | | Unverified
4 | Transformer + R-Drop | BLEU score | 43.95 | | Unverified
5 | Transformer (ADMIN init) | BLEU score | 43.8 | | Unverified
6 | Admin | BLEU score | 43.8 | | Unverified
7 | BERT-fused NMT | BLEU score | 43.78 | | Unverified
8 | MUSE (Parallel Multi-scale Attention) | BLEU score | 43.5 | | Unverified
9 | T5 | BLEU score | 43.4 | | Unverified
10 | Local Joint Self-attention | BLEU score | 43.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | PiNMT | BLEU score | 40.43 | | Unverified
2 | BiBERT | BLEU score | 38.61 | | Unverified
3 | Bi-SimCut | BLEU score | 38.37 | | Unverified
4 | Cutoff + Relaxed Attention + LM | BLEU score | 37.96 | | Unverified
5 | DRDA | BLEU score | 37.95 | | Unverified
6 | Transformer + R-Drop + Cutoff | BLEU score | 37.9 | | Unverified
7 | SimCut | BLEU score | 37.81 | | Unverified
8 | Cutoff+Knee | BLEU score | 37.78 | | Unverified
9 | Cutoff | BLEU score | 37.6 | | Unverified
10 | CipherDAug | BLEU score | 37.53 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | HWTSC-Teacher-Sim | Score | 19.97 | | Unverified
2 | MS-COMET-22 | Score | 19.89 | | Unverified
3 | MS-COMET-QE-22 | Score | 19.76 | | Unverified
4 | KG-BERTScore | Score | 17.28 | | Unverified
5 | metricx_xl_DA_2019 | Score | 17.17 | | Unverified
6 | COMET-QE | Score | 16.8 | | Unverified
7 | COMET-22 | Score | 16.31 | | Unverified
8 | UniTE-src | Score | 15.68 | | Unverified
9 | UniTE-ref | Score | 15.38 | | Unverified
10 | metricx_xxl_DA_2019 | Score | 15.24 | | Unverified