SOTAVerified

Cross-Lingual Natural Language Inference

Using data and models from a language with ample resources (e.g., English) to solve a natural language inference task in another, typically lower-resource, language.
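As a concrete illustration of the task setup, the sketch below mimics the XNLI-style zero-shot protocol: a model is fit on English premise–hypothesis pairs and then evaluated directly on pairs in another language. The data and the `MajorityBaseline` class are invented placeholders; in practice the baseline would be replaced by a multilingual NLI model such as those listed below.

```python
from collections import Counter

# NLI examples are (premise, hypothesis, label) triples with labels in
# {"entailment", "neutral", "contradiction"} -- the XNLI label set.
english_train = [
    ("A man is playing a guitar.", "A man is making music.", "entailment"),
    ("A man is playing a guitar.", "A man is sleeping.", "contradiction"),
    ("A man is playing a guitar.", "The man is on a stage.", "neutral"),
]

# Zero-shot evaluation set in another language (French), never seen in training.
french_test = [
    ("Un homme joue de la guitare.", "Un homme fait de la musique.", "entailment"),
    ("Un homme joue de la guitare.", "Un homme dort.", "contradiction"),
]

class MajorityBaseline:
    """Placeholder model: predicts the most frequent training label.
    A real system would be a multilingual encoder fine-tuned on English NLI."""
    def fit(self, triples):
        self.label = Counter(label for _, _, label in triples).most_common(1)[0][0]
        return self

    def predict(self, premise, hypothesis):
        return self.label

model = MajorityBaseline().fit(english_train)
correct = sum(model.predict(p, h) == gold for p, h, gold in french_test)
print(f"zero-shot accuracy: {correct}/{len(french_test)}")
```

The point of the protocol is that no target-language NLI labels are used at training time; any transfer must come from the model itself (e.g., shared multilingual representations).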

Papers

Showing 1–31 of 31 papers

Title | Status | Hype
Cross-Document Cross-Lingual NLI via RST-Enhanced Graph Fusion and Interpretability Prediction | — | 0
Leveraging Entailment Judgements in Cross-Lingual Summarisation | Code | 0
Do Multilingual Language Models Think Better in English? | Code | 1
Enhancing Cross-lingual Natural Language Inference by Soft Prompting with Multilingual Verbalizer | Code | 0
Robust Unsupervised Cross-Lingual Word Embedding using Domain Flow Interpolation | — | 0
Alexa Teacher Model: Pretraining and Distilling Multi-Billion-Parameter Encoders for Natural Language Understanding Systems | — | 0
A Deep Transfer Learning Method for Cross-Lingual Natural Language Inference | — | 0
Nebula-I: A General Framework for Collaboratively Training Deep Learning Models on Low-Bandwidth Cloud Clusters | Code | 0
Enhancing Cross-lingual Natural Language Inference by Prompt-learning from Cross-lingual Templates | Code | 0
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining | — | 0
Bridging Cross-Lingual Gaps During Leveraging the Multilingual Sequence-to-Sequence Pretraining for Text Generation and Understanding | Code | 0
mGPT: Few-Shot Learners Go Multilingual | Code | 2
Subword Mapping and Anchoring across Languages | Code | 1
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining | Code | 0
Data Augmentation with Adversarial Training for Cross-Lingual NLI | — | 0
Soft Layer Selection with Meta-Learning for Zero-Shot Cross-Lingual Transfer | — | 0
Language Embeddings for Typology and Cross-lingual Transfer Learning | Code | 0
ByT5: Towards a token-free future with pre-trained byte-to-byte models | Code | 1
Cross-Lingual Transfer with MAML on Trees | — | 0
SILT: Efficient transformer training for inter-lingual inference | Code | 0
Meta-Learning with MAML on Trees | — | 0
Rethinking embedding coupling in pre-trained language models | Code | 0
Better Fine-Tuning by Reducing Representational Collapse | Code | 1
On Learning Universal Representations Across Languages | — | 0
Meemi: A Simple Method for Post-processing and Integrating Cross-lingual Word Embeddings | — | 0
Unicoder: A Universal Language Encoder by Pre-training with Multiple Cross-lingual Tasks | — | 0
XLDA: Cross-Lingual Data Augmentation for Natural Language Inference and Question Answering | — | 0
Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond | Code | 1
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Code | 3
XNLI: Evaluating Cross-lingual Sentence Representations | Code | 0
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data | Code | 1

No leaderboard results yet.