SOTAVerified

Machine Translation

Machine translation is the task of automatically translating text from a source language into a target language.

Approaches to machine translation range from rule-based to statistical to neural. More recently, encoder-decoder attention-based architectures such as the Transformer have attained major improvements in machine translation quality.
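The core operation of such attention-based encoder-decoder models is scaled dot-product attention: each decoder query attends over the encoder's key/value vectors. The sketch below is illustrative pseudocode in pure Python (the function names are our own, not any library's API), assuming small lists of float vectors rather than real tensors:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: lists of equal-length float vectors.
    Returns one output vector per query: a softmax-weighted
    mixture of the value vectors, with scores scaled by sqrt(d)."""
    d = len(K[0])
    outputs = []
    for q in Q:
        # Dot-product similarity of this query with every key.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # Weighted sum of value vectors.
        outputs.append([sum(w * v[j] for w, v in zip(weights, V))
                        for j in range(len(V[0]))])
    return outputs
```

With a single key that matches the query exactly, the attention weight is 1 and the output equals the corresponding value vector; real Transformers apply this in parallel across many heads and add learned projections.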

One of the most popular benchmark sources for machine translation systems is the WMT family of datasets. The most commonly used automatic evaluation metrics include BLEU, METEOR, and NIST.
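BLEU, the metric reported in the benchmark tables below, combines clipped n-gram precision (usually up to 4-grams) with a brevity penalty. A minimal sentence-level sketch follows; this is for illustration only (real evaluations use corpus-level tooling such as sacreBLEU, and this simplified version supports a single reference):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of clipped
    n-gram precisions (n = 1..max_n) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        overlap = sum((cand_counts & ref_counts).values())  # clipped counts
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # any zero precision zeroes the geometric mean
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty: punish candidates shorter than the reference.
    bp = 1.0 if len(candidate) > len(reference) else math.exp(1 - len(reference) / len(candidate))
    return bp * geo_mean
```

A perfect match scores 1.0; published BLEU numbers (such as those in the tables below) are this quantity scaled to 0-100 and computed over a whole test corpus.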

(Image credit: Google seq2seq)

Papers

Showing 5651-5700 of 10752 papers

Title | Status | Hype (every paper below has no verification status and a Hype score of 0)

Neural machine translation system for the Kazakh language
Neural Machine Translation System using a Content-equivalently Translated Parallel Corpus for the Newswire Translation Tasks at WAT 2019
Neural Machine Translation Training in a Multi-Domain Scenario
Neural Machine Translation Using Extracted Context Based on Deep Analysis for the Japanese-English Newswire Task at WAT 2020
Neural Machine Translation via Binary Code Prediction
Neural Machine Translation with 4-Bit Precision and Beyond
Neural Machine Translation with Adequacy-Oriented Learning
Neural Machine Translation with Decoding History Enhanced Attention
Neural Machine Translation with Dynamic Graph Convolutional Decoder
Neural Machine Translation with Explicit Phrase Alignment
Neural Machine Translation with Extended Context
Neural Machine Translation with External Phrase Memory
Neural Machine Translation with Gumbel-Greedy Decoding
Neural Machine Translation with Inflected Lexicon
Neural Machine Translation with Key-Value Memory-Augmented Attention
Neural Machine Translation with Latent Semantic of Image and Text
Neural Machine Translation with Noisy Lexical Constraints
Neural Machine Translation with Pivot Languages
Neural Machine Translation with Recurrent Attention Modeling
Neural Machine Translation with Reordering Embeddings
Neural Machine Translation with Source Dependency Representation
Neural Machine Translation with Source-Side Latent Graph Parsing
Neural Machine Translation with Supervised Attention
Neural Machine Translation with Synchronous Latent Phrase Structure
Neural Machine Translation with the Transformer and Multi-Source Romance Languages for the Biomedical WMT 2018 task
Neural Machine Translation with Word Predictions
Neural Metaphor Detecting with CNN-LSTM Model
Neural Monkey: The Current State and Beyond
Neural Morphological Analysis: Encoding-Decoding Canonical Segments
Neural Morphological Tagging of Lemma Sequences for Machine Translation
Neural Name Translation Improves Neural Machine Translation
Neural Network Architectures for Arabic Dialect Identification
Neural Network Based Bilingual Language Model Growing for Statistical Machine Translation
Neural Network-Based Model for Japanese Predicate Argument Structure Analysis
Neural Network Language Models for Candidate Scoring in Hybrid Multi-System Machine Translation
Neural Networks for Multi-Word Expression Detection
Neural Networks For Negation Scope Detection
Neural Networks in a Product of Hyperbolic Spaces
Neural Network Transduction Models in Transliteration Generation
Neural Optimizer Search using Reinforcement Learning
Neural Paraphrase Generation using Transfer Learning
Neural Phrase-to-Phrase Machine Translation
Neural Poetry Translation
Neural Polysynthetic Language Modelling
Neural Post-Editing Based on Quality Estimation
Neural Pre-Translation for Hybrid Machine Translation
Neural Probabilistic Language Model for System Combination
Neural Program Planner for Structured Predictions
Neural Proto-Language Reconstruction
Neural Reordering Model Considering Phrase Translation and Word Alignment for Phrase-based Translation
Page 114 of 216

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | Transformer Cycle (Rev) | BLEU score | 35.14 | - | Unverified
2 | Noisy back-translation | BLEU score | 35 | - | Unverified
3 | Transformer+Rep(Uni) | BLEU score | 33.89 | - | Unverified
4 | T5-11B | BLEU score | 32.1 | - | Unverified
5 | BiBERT | BLEU score | 31.26 | - | Unverified
6 | Transformer + R-Drop | BLEU score | 30.91 | - | Unverified
7 | Bi-SimCut | BLEU score | 30.78 | - | Unverified
8 | BERT-fused NMT | BLEU score | 30.75 | - | Unverified
9 | Data Diversification - Transformer | BLEU score | 30.7 | - | Unverified
10 | SimCut | BLEU score | 30.56 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Transformer+BT (ADMIN init) | BLEU score | 46.4 | - | Unverified
2 | Noisy back-translation | BLEU score | 45.6 | - | Unverified
3 | mRASP+Fine-Tune | BLEU score | 44.3 | - | Unverified
4 | Transformer + R-Drop | BLEU score | 43.95 | - | Unverified
5 | Transformer (ADMIN init) | BLEU score | 43.8 | - | Unverified
6 | Admin | BLEU score | 43.8 | - | Unverified
7 | BERT-fused NMT | BLEU score | 43.78 | - | Unverified
8 | MUSE (Parallel Multi-scale Attention) | BLEU score | 43.5 | - | Unverified
9 | T5 | BLEU score | 43.4 | - | Unverified
10 | Local Joint Self-attention | BLEU score | 43.3 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | PiNMT | BLEU score | 40.43 | - | Unverified
2 | BiBERT | BLEU score | 38.61 | - | Unverified
3 | Bi-SimCut | BLEU score | 38.37 | - | Unverified
4 | Cutoff + Relaxed Attention + LM | BLEU score | 37.96 | - | Unverified
5 | DRDA | BLEU score | 37.95 | - | Unverified
6 | Transformer + R-Drop + Cutoff | BLEU score | 37.9 | - | Unverified
7 | SimCut | BLEU score | 37.81 | - | Unverified
8 | Cutoff+Knee | BLEU score | 37.78 | - | Unverified
9 | Cutoff | BLEU score | 37.6 | - | Unverified
10 | CipherDAug | BLEU score | 37.53 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | HWTSC-Teacher-Sim | Score | 19.97 | - | Unverified
2 | MS-COMET-22 | Score | 19.89 | - | Unverified
3 | MS-COMET-QE-22 | Score | 19.76 | - | Unverified
4 | KG-BERTScore | Score | 17.28 | - | Unverified
5 | metricx_xl_DA_2019 | Score | 17.17 | - | Unverified
6 | COMET-QE | Score | 16.8 | - | Unverified
7 | COMET-22 | Score | 16.31 | - | Unverified
8 | UniTE-src | Score | 15.68 | - | Unverified
9 | UniTE-ref | Score | 15.38 | - | Unverified
10 | metricx_xxl_DA_2019 | Score | 15.24 | - | Unverified