SOTAVerified

Machine Reading Comprehension

Machine Reading Comprehension (MRC) is one of the key problems in Natural Language Understanding: given a text passage, a system must read and comprehend it, and then answer questions based on it.

Source: Making Neural Machine Reading Comprehension Faster
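The task definition above can be illustrated with a minimal sketch. This is a naive word-overlap baseline for extractive reading comprehension, assumed for illustration only; it is not any specific system from the list below, and real MRC models are neural span extractors.

```python
# Toy sketch of extractive machine reading comprehension: given a
# passage and a question, return the passage sentence that shares the
# most non-stopword vocabulary with the question. A hypothetical
# baseline for illustration, not a real MRC model.

STOPWORDS = {"the", "a", "an", "is", "are", "it", "how", "many",
             "does", "do", "in", "of", "on", "what", "which"}

def content_words(text: str) -> set[str]:
    """Lowercase, strip punctuation, and drop stopwords."""
    words = (w.strip(".,?!").lower() for w in text.split())
    return {w for w in words if w and w not in STOPWORDS}

def answer(passage: str, question: str) -> str:
    """Pick the sentence with the highest word overlap with the question."""
    q_words = content_words(question)
    sentences = [s.strip() for s in passage.split(".") if s.strip()]
    return max(sentences, key=lambda s: len(content_words(s) & q_words))

passage = ("The Amazon is the largest rainforest on Earth. "
           "It spans nine countries in South America.")
print(answer(passage, "How many countries does it span?"))
# → It spans nine countries in South America
```

Neural MRC systems replace the overlap score with learned representations that predict an answer span directly, but the input/output contract (passage + question → answer text) is the same.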

Papers

Showing 501-550 of 555 papers

Titles (all papers on this page have a Hype score of 0):

G4: Grounding-guided Goal-oriented Dialogues Generation with Multiple Documents
GAAMA 2.0: An Integrated System that Answers Boolean and Extractive Questions
Generative Large Language Models Are All-purpose Text Analytics Engines: Text-to-text Learning Is All Your Need
GenNet: Reading Comprehension with Multiple Choice Questions using Generation and Selection model
Graph-Based Knowledge Integration for Question Answering over Dialogue
Graph-combined Coreference Resolution Methods on Conversational Machine Reading Comprehension with Pre-trained Language Model
Graph Sequential Network for Reasoning over Sequences
Have You Seen That Number? Investigating Extrapolation in Question Answering Models
ClueReader: Heterogeneous Graph Attention Network for Multi-hop Machine Reading Comprehension
Hierarchical Evaluation Framework: Best Practices for Human Evaluation
HRCA+: Advanced Multiple-choice Machine Reading Comprehension Method
Explicit Contextual Semantics for Text Comprehension
Improved Synthetic Training for Reading Comprehension
Improve Neural Entity Recognition via Multi-Task Data Selection and Constrained Decoding
Improving Cross-Lingual Reading Comprehension with Self-Training
Improving Machine Reading Comprehension via Adversarial Training
Improving Machine Reading Comprehension with Contextualized Commonsense Knowledge
Improving Machine Reading Comprehension with Single-choice Decision and Transfer Learning
Improving Opinion-based Question Answering Systems Through Label Error Detection and Overwrite
Improving Pre-Trained Multilingual Models with Vocabulary Expansion
Improving Pre-Trained Multilingual Model with Vocabulary Expansion
Improving the Robustness of Deep Reading Comprehension Models by Leveraging Syntax Prior
Improving Zero-Shot Event Extraction via Sentence Simplification
Incorporating Connections Beyond Knowledge Embeddings: A Plug-and-Play Module to Enhance Commonsense Reasoning in Machine Reading Comprehension
Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning
Incorporating Syntax and Frame Semantics in Neural Network for Machine Reading Comprehension
Increasing the Difficulty of Automatically Generated Questions via Reinforcement Learning with Synthetic Preference
Information Extraction from Documents: Question Answering vs Token Classification in real-world setups
Inspecting Unification of Encoding and Matching with Transformer: A Case Study of Machine Reading Comprehension
Integrated Triaging for Fast Reading Comprehension
Integrating a Heterogeneous Graph with Entity-aware Self-attention using Relative Position Labels for Reading Comprehension Model
Integrating Semantic Information into Sketchy Reading Module of Retro-Reader for Vietnamese Machine Reading Comprehension
Interpretable Semantic Role Relation Table for Supporting Facts Recognition of Reading Comprehension
Interpretable Traces, Unexpected Outcomes: Investigating the Disconnect in Trace-Based Knowledge Distillation
Interpreting Attention Models with Human Visual Attention in Machine Reading Comprehension
Investigating a Benchmark for Training-set free Evaluation of Linguistic Capabilities in Machine Reading Comprehension
Investigating Recent Large Language Models for Vietnamese Machine Reading Comprehension
Jiangnan at SemEval-2018 Task 11: Deep Neural Network with Attention Method for Machine Comprehension Task
Multilingual Machine Reading Comprehension Based on BERT Model (基於BERT模型之多國語言機器閱讀理解研究)
Research on Machine Reading Comprehension Based on Shared Structure Information between Naming and Telling (基于话头话体共享结构信息的机器阅读理解研究)
Machine Reading Comprehension Data Augmentation for Sentence Selection Based on Similarity (基于相似度进行句子选择的机器阅读理解数据增强)
Chinese Machine Reading Comprehension Based on Clause Complexes (基于小句复合体的中文机器阅读理解研究)
KECP: Knowledge Enhanced Contrastive Prompting for Few-shot Extractive Question Answering
Keyword-based Query Comprehending via Multiple Optimized-Demand Augmentation
KgPLM: Knowledge-guided Language Model Pre-training via Generative and Discriminative Learning
KILDST: Effective Knowledge-Integrated Learning for Dialogue State Tracking using Gazetteer and Speaker Information
Knowledge Based Machine Reading Comprehension
Know your tools well: Better and faster QA with synthetic examples
KorQuAD1.0: Korean QA Dataset for Machine Reading Comprehension
Page 11 of 12

No leaderboard results yet.