HUJI-KU at MRP~2020: Two Transition-based Neural Parsers

2020-10-12

Ofir Arviv, Ruixiang Cui, Daniel Hershcovich

Abstract

This paper describes the HUJI-KU system submission to the shared task on Cross-Framework Meaning Representation Parsing (MRP) at the 2020 Conference on Computational Natural Language Learning (CoNLL), employing TUPA and the HIT-SCIR parser, which were, respectively, the baseline system and the winning system in the 2019 MRP shared task. Both are transition-based parsers using BERT contextualized embeddings. We generalized TUPA to support the newly-added MRP frameworks and languages, and experimented with multitask learning with the HIT-SCIR parser. We reached 4th place in both the cross-framework and cross-lingual tracks.
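To illustrate what "transition-based parsing" means in the abstract, below is a minimal, hypothetical sketch of the general technique: a stack/buffer state machine where a scoring function picks the next transition. This is not the TUPA or HIT-SCIR implementation (those are neural models over BERT embeddings and produce general meaning-representation graphs, not just trees); the stub scorer and all names here are illustrative assumptions.

```python
# Minimal sketch of a transition-based parser loop (illustrative only,
# NOT the actual TUPA or HIT-SCIR code). A classifier scores legal
# transitions at each state; in the real systems that classifier is a
# neural network over BERT contextualized token embeddings.

from dataclasses import dataclass, field

@dataclass
class State:
    stack: list = field(default_factory=list)
    buffer: list = field(default_factory=list)
    arcs: list = field(default_factory=list)  # (head, dependent) edges

    def is_final(self):
        return not self.buffer and len(self.stack) <= 1

def legal_transitions(state):
    moves = []
    if state.buffer:
        moves.append("SHIFT")
    if len(state.stack) >= 2:
        moves.extend(["LEFT-ARC", "RIGHT-ARC"])
    return moves

def apply_move(state, move):
    if move == "SHIFT":
        state.stack.append(state.buffer.pop(0))
    elif move == "LEFT-ARC":
        dep = state.stack.pop(-2)          # second-from-top is dependent
        state.arcs.append((state.stack[-1], dep))
    elif move == "RIGHT-ARC":
        dep = state.stack.pop()            # top of stack is dependent
        state.arcs.append((state.stack[-1], dep))

def parse(tokens, score_fn):
    """Greedy decoding: repeatedly apply the highest-scoring legal move."""
    state = State(buffer=list(tokens))
    while not state.is_final():
        move = max(legal_transitions(state), key=lambda m: score_fn(state, m))
        apply_move(state, move)
    return state.arcs

# Stub scorer (stands in for the neural classifier): always prefer SHIFT,
# then RIGHT-ARC, then LEFT-ARC.
def greedy_stub(state, move):
    return {"SHIFT": 2, "RIGHT-ARC": 1, "LEFT-ARC": 0}[move]

arcs = parse(["the", "cat", "sat"], greedy_stub)
# With this stub, each word attaches to the word before it:
# [("cat", "sat"), ("the", "cat")]
```

The design point this sketch captures is the one the abstract relies on: the parser itself is framework-agnostic, so swapping the transition set and the scoring network is what lets a single architecture cover multiple frameworks and languages.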

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| AMR (Chinese, MRP 2020) | HUJI-KU | F1 | 45 | | Unverified |
| AMR (English, MRP 2020) | HUJI-KU | F1 | 52 | | Unverified |
| DRG (English, MRP 2020) | HUJI-KU | F1 | 63 | | Unverified |
| DRG (German, MRP 2020) | HUJI-KU | F1 | 62 | | Unverified |
| EDS (English, MRP 2020) | HUJI-KU | F1 | 80 | | Unverified |
| PTG (Czech, MRP 2020) | HUJI-KU | F1 | 58 | | Unverified |
| PTG (English, MRP 2020) | HUJI-KU | F1 | 54 | | Unverified |
| UCCA (English, MRP 2020) | HUJI-KU | F1 | 73 | | Unverified |
| UCCA (German, MRP 2020) | HUJI-KU | F1 | 75 | | Unverified |