SOTAVerified

Sentence

Papers

Showing 651–675 of 10752 papers

Title | Status | Hype
--- | --- | ---
Don't Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner | Code | 1
A Two-Stream AMR-enhanced Model for Document-level Event Argument Extraction | Code | 1
AugCSE: Contrastive Sentence Embedding with Diverse Augmentations | Code | 1
Double Graph Based Reasoning for Document-level Relation Extraction | Code | 1
An Analysis of Simple Data Augmentation for Named Entity Recognition | Code | 1
A Better Way to Do Masked Language Model Scoring | Code | 1
Augmenting Transformers with Recursively Composed Multi-grained Representations | Code | 1
Dual-Alignment Pre-training for Cross-lingual Sentence Embedding | Code | 1
A Unified Span-Based Approach for Opinion Mining with Syntactic Constituents | Code | 1
A unified approach to sentence segmentation of punctuated text in many languages | Code | 1
Dynamic Self-Attention: Computing Attention over Words Dynamically for Sentence Embedding | Code | 1
EASE: Entity-Aware Contrastive Learning of Sentence Embedding | Code | 1
SentenceVAE: Enable Next-sentence Prediction for Large Language Models with Faster Speed, Higher Accuracy and Longer Context | Code | 1
A Cascade Dual-Decoder Model for Joint Entity and Relation Extraction | Code | 1
Anaphor Assisted Document-Level Relation Extraction | Code | 1
AutoAD-Zero: A Training-Free Framework for Zero-Shot Audio Description | Code | 1
A Decomposable Attention Model for Natural Language Inference | Code | 1
Efficient Few-shot Learning for Multi-label Classification of Scientific Documents with Many Classes | Code | 1
Efficient Intent Detection with Dual Sentence Encoders | Code | 1
Efficient Neural Architecture for Text-to-Image Synthesis | Code | 1
The MSR-Video to Text Dataset with Clean Annotations | Code | 1
Automatic Charge Identification from Facts: A Few Sentence-Level Charge Annotations is All You Need | Code | 1
Adversarial Attack and Defense of Structured Prediction Models | Code | 1
An Attribution Method for Siamese Encoders | Code | 1
Content Planning for Neural Story Generation with Aristotelian Rescoring | Code | 1
Page 27 of 431

No leaderboard results yet.