
Machine Reading Comprehension

Machine Reading Comprehension is a key problem in Natural Language Understanding: given a text passage, the task is to read and comprehend it, then answer questions about its content.
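To make the task interface concrete, here is a toy, hypothetical baseline (not any of the methods listed below): given a passage and a question, it returns the passage sentence sharing the most words with the question. Real MRC systems are neural models that extract or generate an answer span; this sketch only illustrates the input/output contract.

```python
import re


def answer(passage: str, question: str) -> str:
    """Return the passage sentence with the highest word overlap
    with the question (naive lexical-overlap baseline)."""
    q_words = set(re.findall(r"\w+", question.lower()))
    # Split the passage into sentences on terminal punctuation.
    sentences = re.split(r"(?<=[.!?])\s+", passage.strip())
    # Score each sentence by how many question words it shares.
    return max(
        sentences,
        key=lambda s: len(q_words & set(re.findall(r"\w+", s.lower()))),
    )


passage = (
    "The Amazon is the largest rainforest on Earth. "
    "It spans nine countries in South America."
)
print(answer(passage, "How many countries does it span?"))
```

A real reader would instead score every candidate answer span with a trained model; the lexical-overlap heuristic here fails whenever the answer sentence paraphrases the question.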

Source: Making Neural Machine Reading Comprehension Faster

Papers

Showing 376–400 of 555 papers

Improving the Robustness of Deep Reading Comprehension Models by Leveraging Syntax Prior
Improving Zero-Shot Event Extraction via Sentence Simplification
Incorporating Connections Beyond Knowledge Embeddings: A Plug-and-Play Module to Enhance Commonsense Reasoning in Machine Reading Comprehension
Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning
Incorporating Syntax and Frame Semantics in Neural Network for Machine Reading Comprehension
Increasing the Difficulty of Automatically Generated Questions via Reinforcement Learning with Synthetic Preference
Information Extraction from Documents: Question Answering vs Token Classification in real-world setups
Inspecting Unification of Encoding and Matching with Transformer: A Case Study of Machine Reading Comprehension
Integrated Triaging for Fast Reading Comprehension
Integrating a Heterogeneous Graph with Entity-aware Self-attention using Relative Position Labels for Reading Comprehension Model
Integrating Semantic Information into Sketchy Reading Module of Retro-Reader for Vietnamese Machine Reading Comprehension
Interpretable Semantic Role Relation Table for Supporting Facts Recognition of Reading Comprehension
Interpretable Traces, Unexpected Outcomes: Investigating the Disconnect in Trace-Based Knowledge Distillation
Interpreting Attention Models with Human Visual Attention in Machine Reading Comprehension
Investigating a Benchmark for Training-set free Evaluation of Linguistic Capabilities in Machine Reading Comprehension
Investigating Recent Large Language Models for Vietnamese Machine Reading Comprehension
Jiangnan at SemEval-2018 Task 11: Deep Neural Network with Attention Method for Machine Comprehension Task
Multilingual Machine Reading Comprehension Based on BERT Model
Research on Machine Reading Comprehension Based on Shared Structure Information between Naming and Telling
Machine Reading Comprehension Data Augmentation for Sentence Selection Based on Similarity
Machine Reading Comprehension Based on Clause Complex
KECP: Knowledge Enhanced Contrastive Prompting for Few-shot Extractive Question Answering
Keyword-based Query Comprehending via Multiple Optimized-Demand Augmentation
KgPLM: Knowledge-guided Language Model Pre-training via Generative and Discriminative Learning
Page 16 of 23

No leaderboard results yet.