
Machine Reading Comprehension

Machine Reading Comprehension (MRC) is a core problem in Natural Language Understanding: given a text passage, a system must read and comprehend it, and then answer questions about it.

Source: Making Neural Machine Reading Comprehension Faster
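The task format above (passage, question) → answer can be illustrated with a toy word-overlap baseline. This is a hypothetical sketch for illustration only, not any of the systems listed below: it simply returns the passage sentence that shares the most words with the question.

```python
# Toy illustration of the MRC input/output format, not a published model:
# answer(passage, question) returns the passage sentence with the largest
# word overlap with the question.

def answer(passage: str, question: str) -> str:
    """Return the passage sentence sharing the most words with the question."""
    q_words = set(question.lower().rstrip("?").split())
    sentences = [s.strip() for s in passage.split(".") if s.strip()]
    # Score each sentence by the number of question words it contains.
    return max(sentences, key=lambda s: len(q_words & set(s.lower().split())))

passage = ("The Eiffel Tower was completed in 1889. It stands in Paris. "
           "The tower is 330 metres tall.")
print(answer(passage, "When was the Eiffel Tower completed?"))
# prints: The Eiffel Tower was completed in 1889
```

Real MRC systems replace the overlap heuristic with learned neural models, but the input/output contract is the same.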

Papers

Showing 501–525 of 555 papers

Title — Hype

G4: Grounding-guided Goal-oriented Dialogues Generation with Multiple Documents — 0
GAAMA 2.0: An Integrated System that Answers Boolean and Extractive Questions — 0
Generative Large Language Models Are All-purpose Text Analytics Engines: Text-to-text Learning Is All Your Need — 0
GenNet : Reading Comprehension with Multiple Choice Questions using Generation and Selection model — 0
Graph-Based Knowledge Integration for Question Answering over Dialogue — 0
Graph-combined Coreference Resolution Methods on Conversational Machine Reading Comprehension with Pre-trained Language Model — 0
Graph Sequential Network for Reasoning over Sequences — 0
Have You Seen That Number? Investigating Extrapolation in Question Answering Models — 0
ClueReader: Heterogeneous Graph Attention Network for Multi-hop Machine Reading Comprehension — 0
Hierarchical Evaluation Framework: Best Practices for Human Evaluation — 0
HRCA+: Advanced Multiple-choice Machine Reading Comprehension Method — 0
Explicit Contextual Semantics for Text Comprehension — 0
Improved Synthetic Training for Reading Comprehension — 0
Improve Neural Entity Recognition via Multi-Task Data Selection and Constrained Decoding — 0
Improving Cross-Lingual Reading Comprehension with Self-Training — 0
Improving Machine Reading Comprehension via Adversarial Training — 0
Improving Machine Reading Comprehension with Contextualized Commonsense Knowledge — 0
Improving Machine Reading Comprehension with Single-choice Decision and Transfer Learning — 0
Improving Opinion-based Question Answering Systems Through Label Error Detection and Overwrite — 0
Improving Pre-Trained Multilingual Models with Vocabulary Expansion — 0
Improving Pre-Trained Multilingual Model with Vocabulary Expansion — 0
Improving the Robustness of Deep Reading Comprehension Models by Leveraging Syntax Prior — 0
Improving Zero-Shot Event Extraction via Sentence Simplification — 0
Incorporating Connections Beyond Knowledge Embeddings: A Plug-and-Play Module to Enhance Commonsense Reasoning in Machine Reading Comprehension — 0
Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning — 0
Page 21 of 23

No leaderboard results yet.