SOTAVerified

NMT

Neural machine translation (NMT) is an approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modeling entire sentences in a single integrated model.
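The "likelihood of a sequence of words" in the definition above comes from the standard autoregressive factorization P(y|x) = ∏ₜ P(yₜ | y₍<t₎, x). A minimal sketch of that factorization, with a hypothetical hand-written scoring function standing in for a trained neural decoder (the function name and its behavior are illustrative assumptions, not part of any real system):

```python
import math

def toy_next_token_probs(source_tokens, prefix, vocab):
    # Hypothetical stand-in for one neural decoder step: a distribution
    # over the vocabulary that slightly favors tokens also present in
    # the source sentence. A real NMT model would condition on both the
    # source and the generated prefix; this toy ignores the prefix.
    scores = {w: (2.0 if w in source_tokens else 1.0) for w in vocab}
    z = sum(scores.values())
    return {w: s / z for w, s in scores.items()}

def sequence_log_prob(source_tokens, target_tokens, vocab):
    """Log-likelihood of an entire target sentence under the toy model,
    accumulated token by token via the chain-rule factorization."""
    logp = 0.0
    for t, tok in enumerate(target_tokens):
        probs = toy_next_token_probs(source_tokens, target_tokens[:t], vocab)
        logp += math.log(probs[tok])
    return logp
```

Swapping the toy scorer for a trained encoder-decoder network yields the "single integrated model" the definition describes: one network scores whole sentences rather than isolated phrases.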

Papers

Showing 951–1000 of 1773 papers

Title | Hype
Fine-Grained Error Analysis on English-to-Japanese Machine Translation in the Medical Domain | 0
Fine Grained Human Evaluation for English-to-Chinese Machine Translation: A Case Study on Scientific Text | 0
Finetuning a Kalaallisut-English machine translation system using web-crawled data | 0
First Experiments with Neural Translation of Informal to Formal Mathematics | 0
Fix-Filter-Fix: Intuitively Connect Any Models for Effective Bug Fixing | 0
Fixing exposure bias with imitation learning needs powerful oracles | 0
FJWU Participation for the WMT21 Biomedical Translation Task | 0
Flow-Adapter Architecture for Unsupervised Machine Translation | 0
Forest-Based Neural Machine Translation | 0
FrameNet Annotations Alignment using Attention-based Machine Translation | 0
FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models | 0
Frequency-Aware Contrastive Learning for Neural Machine Translation | 0
From Balustrades to Pierre Vinken: Looking for Syntax in Transformer Self-Attentions | 0
From LLM to NMT: Advancing Low-Resource Machine Translation with Claude | 0
Fusing Recency into Neural Machine Translation with an Inter-Sentence Gate Model | 0
Future-Prediction-Based Model for Neural Machine Translation | 0
GATE X-E: A Challenge Set for Gender-Fair Translations from Weakly-Gendered Languages | 0
Gender Aware Spoken Language Translation Applied to English-Arabic | 0
Gender Bias Amplification During Speed-Quality Optimization in Neural Machine Translation | 0
Gender-specific Machine Translation with Large Language Models | 0
General2Specialized LLMs Translation for E-commerce | 0
Generalization algorithm of multimodal pre-training model based on graph-text self-supervised training | 0
Generalizing Back-Translation in Neural Machine Translation | 0
Generating Authentic Adversarial Examples beyond Meaning-preserving with Doubly Round-trip Translation | 0
Generating Diverse Translation from Model Distribution with Dropout | 0
Generating Gender Augmented Data for NLP | 0
Getting Gender Right in Neural Machine Translation | 0
GraphSeq2Seq: Graph-Sequence-to-Sequence for Neural Machine Translation | 0
Graph-to-Sequence Neural Machine Translation | 0
Guider l'attention dans les modèles de séquence à séquence pour la prédiction des actes de dialogue (Guiding Attention in Sequence-to-Sequence Models for Dialogue Act Prediction) | 0
Guiding Neural Machine Translation with Retrieved Translation Pieces | 0
Hallucinations in Neural Machine Translation | 0
Hallucinations in neural machine translation | 0
Handling Homographs in Neural Machine Translation | 0
Hard but Robust, Easy but Sensitive: How Encoder and Decoder Perform in Neural Machine Translation | 0
HausaMT v1.0: Towards English-Hausa Neural Machine Translation | 0
Heterogeneous Recycle Generation for Chinese Grammatical Error Correction | 0
HI-CMLM: Improve CMLM with Hybrid Decoder Input | 0
Hierarchical Modeling of Global Context for Document-Level Neural Machine Translation | 0
Hierarchical Sequence to Sequence Voice Conversion with Limited Data | 0
High-Quality Data Augmentation for Low-Resource NMT: Combining a Translation Memory, a GAN Generator, and Filtering | 0
HilMeMe: A Human-in-the-Loop Machine Translation Evaluation Metric Looking into Multi-Word Expressions | 0
Hindi-Marathi Cross Lingual Model | 0
Hindi to English: Transformer-Based Neural Machine Translation | 0
HintedBT: Augmenting Back-Translation with Quality and Transliteration Hints | 0
How Does Pretraining Improve Discourse-Aware Translation? | 0
How Do Source-side Monolingual Word Embeddings Impact Neural Machine Translation? | 0
How Effective is Byte Pair Encoding for Out-Of-Vocabulary Words in Neural Machine Translation? | 0
How Much Attention Do You Need? A Granular Analysis of Neural Machine Translation Architectures | 0
How Much Does Tokenization Affect Neural Machine Translation? | 0
Page 20 of 36

No leaderboard results yet.