
Question Answering Using Hierarchical Attention on Top of BERT Features

2019-11-01 · WS 2019

Reham Osama, Nagwa El-Makky, Marwan Torki


Abstract

The submitted model works as follows. Given a question and a passage, it uses BERT embeddings together with a hierarchical attention model consisting of two parts, co-attention and self-attention, to locate the continuous span of the passage that answers the question.
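The pipeline described in the abstract can be sketched in plain NumPy. The dimensions, the dot-product affinity matrices, the span-scoring vectors `w_start` and `w_end`, and the function names below are all illustrative assumptions, not the paper's exact architecture; in the actual model, `H_q` and `H_p` would be BERT features for the question and passage tokens.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(H_q, H_p):
    """Build a question-aware passage representation (illustrative)."""
    A = H_p @ H_q.T                      # (Lp, Lq) affinity matrix
    C_p = softmax(A, axis=-1) @ H_q      # each passage token attends to the question
    return np.concatenate([H_p, C_p], axis=-1)

def self_attention(H):
    """Let each passage position attend over all passage positions."""
    A = H @ H.T                          # (Lp, Lp) affinity matrix
    return softmax(A, axis=-1) @ H

def predict_span(H_q, H_p, w_start, w_end):
    """Score each passage position as the answer start / end."""
    U = self_attention(co_attention(H_q, H_p))
    start = int(np.argmax(U @ w_start))
    # Constrain the end index to come at or after the start index.
    end = start + int(np.argmax(U[start:] @ w_end))
    return start, end

# Toy example with random "embeddings" standing in for BERT features.
rng = np.random.default_rng(0)
d, Lq, Lp = 8, 5, 12
H_q = rng.normal(size=(Lq, d))           # question token features
H_p = rng.normal(size=(Lp, d))           # passage token features
w_start = rng.normal(size=2 * d)         # hypothetical start-scoring vector
w_end = rng.normal(size=2 * d)           # hypothetical end-scoring vector
s, e = predict_span(H_q, H_p, w_start, w_end)
print(s, e)
```

The printed pair is a (start, end) token span within the passage, which corresponds to the "continuous span" the abstract describes as the answer.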
