SOTAVerified


Papers

Showing 76–100 of 10,752 papers

Title | Status | Hype
RetroMAE v2: Duplex Masked Auto-Encoder For Pre-Training Retrieval-Oriented Language Models | Code | 2
Towards Realistic Low-resource Relation Extraction: A Benchmark with Empirical Baseline Study | Code | 2
CCTC: A Cross-Sentence Chinese Text Correction Dataset for Native Speakers | Code | 2
TEACH: Temporal Action Composition for 3D Humans | Code | 2
Comprehending and Ordering Semantics for Image Captioning | Code | 2
Compositional Visual Generation with Composable Diffusion Models | Code | 2
RetroMAE: Pre-Training Retrieval-oriented Language Models Via Masked Auto-Encoder | Code | 2
"I'm sorry to hear that": Finding New Biases in Language Models with a Holistic Descriptor Dataset | Code | 2
NaturalSpeech: End-to-End Text to Speech Synthesis with Human-Level Quality | Code | 2
MuCGEC: a Multi-Reference Multi-Source Evaluation Dataset for Chinese Grammatical Error Correction | Code | 2
Exploring a Fine-Grained Multiscale Method for Cross-Modal Remote Sensing Image Retrieval | Code | 2
DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings | Code | 2
CampNet: Context-Aware Mask Prediction for End-to-End Text-Based Speech Editing | Code | 2
SGPT: GPT Sentence Embeddings for Semantic Search | Code | 2
PromptBERT: Improving BERT Sentence Embeddings with Prompts | Code | 2
CVSS Corpus and Massively Multilingual Speech-to-Speech Translation | Code | 2
Deduplicating Training Data Makes Language Models Better | Code | 2
SimCSE: Simple Contrastive Learning of Sentence Embeddings | Code | 2
Pretrained Transformers for Text Ranking: BERT and Beyond | Code | 2
Abstractive Summarization of Spoken and Written Instructions with BERT | Code | 2
Reevaluating Adversarial Examples in Natural Language | Code | 2
MPNet: Masked and Permuted Pre-training for Language Understanding | Code | 2
CLUE: A Chinese Language Understanding Evaluation Benchmark | Code | 2
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations | Code | 2
PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification | Code | 2
Page 4 of 431

No leaderboard results yet.