SOTAVerified

Machine Reading Comprehension

Machine Reading Comprehension (MRC) is a key problem in Natural Language Understanding: a system must read and comprehend a given text passage, then answer questions about it.
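To make the task format concrete, here is a minimal sketch of an extractive reading-comprehension baseline. It is purely illustrative (not any system from the paper list below): it scores each sentence of the passage by content-word overlap with the question and returns the best-matching sentence as the answer region. The `answer` function and the example passage are assumptions for illustration.

```python
def answer(passage: str, question: str) -> str:
    """Toy extractive MRC: return the passage sentence that best
    matches the question by simple word overlap."""
    # Normalize question words: lowercase and strip trailing punctuation.
    q_words = {w.strip("?.,").lower() for w in question.split()}
    # Naive sentence split on periods; real systems use a tokenizer.
    sentences = [s.strip() for s in passage.split(".") if s.strip()]
    # Pick the sentence sharing the most words with the question.
    return max(sentences, key=lambda s: len(q_words & set(s.lower().split())))

passage = ("BERT was introduced by Google in 2018. "
           "It is pre-trained on large text corpora.")
print(answer(passage, "Who introduced BERT?"))
# → BERT was introduced by Google in 2018
```

Modern MRC models replace this overlap heuristic with learned span prediction (e.g. predicting answer start and end token positions), but the input/output contract is the same: passage plus question in, answer text out.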

Source: Making Neural Machine Reading Comprehension Faster

Papers

Showing 1–25 of 555 papers

Title | Status | Hype
Pre-Training with Whole Word Masking for Chinese BERT | Code | 3
The Belebele Benchmark: a Parallel Reading Comprehension Dataset in 122 Language Variants | Code | 2
MiniRBT: A Two-stage Distilled Small Chinese Pre-trained Model | Code | 2
CLUE: A Chinese Language Understanding Evaluation Benchmark | Code | 2
Multi-Grained Query-Guided Set Prediction Network for Grounded Multimodal Named Entity Recognition | Code | 1
ArabicaQA: A Comprehensive Dataset for Arabic Question Answering | Code | 1
ChroniclingAmericaQA: A Large-scale Question Answering Dataset based on Historical American Newspaper Pages | Code | 1
Mirror: A Universal Framework for Various Information Extraction Tasks | Code | 1
MPrompt: Exploring Multi-level Prompt Tuning for Machine Reading Comprehension | Code | 1
IDOL: Indicator-oriented Logic Pre-training for Logical Reasoning | Code | 1
Sentence-level Event Detection without Triggers via Prompt Learning and Machine Reading Comprehension | Code | 1
Modeling Hierarchical Reasoning Chains by Linking Discourse Units and Key Phrases for Reading Comprehension | Code | 1
NorQuAD: Norwegian Question Answering Dataset | Code | 1
Context-faithful Prompting for Large Language Models | Code | 1
Orca: A Few-shot Benchmark for Chinese Conversational Machine Reading Comprehension | Code | 1
GENIUS: Sketch-based Language Model Pre-training via Extreme and Selective Masking for Text Generation and Augmentation | Code | 1
NEREL-BIO: A Dataset of Biomedical Abstracts Annotated with Nested Named Entities | Code | 1
Multitask Pre-training of Modular Prompt for Chinese Few-Shot Learning | Code | 1
A Multi-turn Machine Reading Comprehension Framework with Rethink Mechanism for Emotion-Cause Pair Extraction | Code | 1
A Robustly Optimized BMRC for Aspect Sentiment Triplet Extraction | Code | 1
End-to-End Chinese Speaker Identification | Code | 1
FinBERT-MRC: Financial Named Entity Recognition Using BERT under the Machine Reading Comprehension Paradigm | Code | 1
Logiformer: A Two-Branch Graph Transformer Network for Interpretable Logical Reasoning | Code | 1
Learning Disentangled Semantic Representations for Zero-Shot Cross-Lingual Transfer in Multilingual Machine Reading Comprehension | Code | 1
AdaLoGN: Adaptive Logic Graph Network for Reasoning-Based Machine Reading Comprehension | Code | 1
Page 1 of 23

No leaderboard results yet.