Papers

Showing 3976–4000 of 10752 papers

Title | Status | Hype
Knowledge Enhanced Embedding: Improve Model Generalization Through Knowledge Graphs | | 0
Effective Unsupervised Constrained Text Generation based on Perturbed Masking | | 0
Looking Into the Black Box - How Are Idioms Processed in BERT? | | 0
A Novel End-to-End CAPT System for L2 Children Learners | | 0
One-to-Many and Many-to-One Dialogue Learning via Sentence Semantic Segmentation Guided Conditional Variational Auto-Encoder | | 0
NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original Pre-training Task —— Next Sentence Prediction | | 0
DocEE: A Large-Scale and Fine-grained Benchmark for Document-level Event Extraction | | 0
Divide and Rule: Effective Pre-Training for Context-Aware Multi-Encoder Translation Models | | 0
MarkBERT: Marking Word Boundaries Improves Chinese BERT | Code | 1
DISAPERE: A Dataset for Discourse Structure in Peer Review Discussions | | 0
Language Level Classification on German Texts using a Neural Approach | | 0
DGMED: A Novel Document-Level Graph Convolution Network for Multi-Event Detection | | 0
An Empirical Study of Document-to-document Neural Machine Translation | | 0
Representation of Ambiguity in Pre-Trained Sentence Embeddings | | 0
RoMe: A Robust Metric for Evaluating Natural Language Generation | | 0
GenRE: A Generative Model for Relation Extraction | | 0
PRIMERA: Pyramid-based Masked Sentence Pre-training for Multi-document Summarization | Code | 1
A Natural Diet: Towards Improving Naturalness of Machine Translation Output | | 0
RE: A Study for Restorable Embeddings | | 0
Utterance Rewriting with Contrastive Learning in Multi-turn Dialogue | | 0
The Change that Matters in Discourse Parsing: Estimating the Impact of Domain Shift on Parser Error | | 0
Good Examples Make A Faster Learner: Simple Demonstration-based Learning for Low-resource NER | | 0
Towards Coherent Visual Storytelling with Ordered Image Attention | | 0
Co-training an Unsupervised Constituency Parser with Weak Supervision | | 0
Toward Fine-grained Causality Reasoning and CausalQA | | 0
Page 160 of 431

No leaderboard results yet.