SOTAVerified

Cross-Lingual Natural Language Inference

Using data and models from a language with ample resources (e.g., English) to solve a natural language inference task in another, typically lower-resource, language.

Papers

Showing 11–20 of 31 papers

| Title | Status | Hype |
| --- | --- | --- |
| Bridging Cross-Lingual Gaps During Leveraging the Multilingual Sequence-to-Sequence Pretraining for Text Generation and Understanding | Code | 0 |
| mGPT: Few-Shot Learners Go Multilingual | Code | 2 |
| Subword Mapping and Anchoring across Languages | Code | 1 |
| PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining | Code | 0 |
| Data Augmentation with Adversarial Training for Cross-Lingual NLI | | 0 |
| Soft Layer Selection with Meta-Learning for Zero-Shot Cross-Lingual Transfer | | 0 |
| Language Embeddings for Typology and Cross-lingual Transfer Learning | Code | 0 |
| ByT5: Towards a token-free future with pre-trained byte-to-byte models | Code | 1 |
| Cross-Lingual Transfer with MAML on Trees | | 0 |
| SILT: Efficient transformer training for inter-lingual inference | Code | 0 |
Page 2 of 4

No leaderboard results yet.