SOTAVerified

AMR Parsing

Each AMR is a single rooted, directed graph. AMRs include PropBank semantic roles, within-sentence coreference, named entities and types, modality, negation, questions, quantities, and so on.
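The graph structure can be illustrated with the standard textbook AMR for "The boy wants to go". The triple encoding and root check below are a minimal sketch for illustration only, not tied to any particular parser or library:

```python
# The classic AMR for "The boy wants to go", encoded as
# (source, role, target) triples. The reentrancy (b is :ARG0 of both
# want-01 and go-02) makes this a rooted DAG rather than a tree.
triples = [
    ("w", ":instance", "want-01"),
    ("b", ":instance", "boy"),
    ("g", ":instance", "go-02"),
    ("w", ":ARG0", "b"),
    ("w", ":ARG1", "g"),
    ("g", ":ARG0", "b"),
]

def find_root(triples):
    """Return the unique node with no incoming relation edge (the AMR root)."""
    nodes = {s for s, r, t in triples}
    targets = {t for s, r, t in triples if r != ":instance"}
    roots = nodes - targets
    assert len(roots) == 1, "an AMR has exactly one root"
    return roots.pop()

print(find_root(triples))  # -> w (the want-01 node)
```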

Papers

Showing 1–10 of 117 papers

Title | Status | Hype
Reassessing Graph Linearization for Sequence-to-sequence AMR Parsing: On the Advantages and Limitations of Triple-Based Encoding | Code | 0
Should Cross-Lingual AMR Parsing go Meta? An Empirical Assessment of Meta-Learning and Joint Learning AMR Parsing | Code | 0
Adapting Abstract Meaning Representation Parsing to the Clinical Narrative -- the SPRING THYME parser | | 0
AMR Parsing is Far from Solved: GrAPES, the Granular AMR Parsing Evaluation Suite | Code | 0
AMR Parsing with Causal Hierarchical Attention and Pointers | Code | 0
Guiding AMR Parsing with Reverse Graph Linearization | Code | 0
Incorporating Graph Information in Transformer-based AMR Parsing | Code | 0
AMRs Assemble! Learning to Ensemble with Autoregressive Models for AMR Parsing | Code | 0
Slide, Constrain, Parse, Repeat: Synchronous SlidingWindows for Document AMR Parsing | | 0
AMR Parsing with Instruction Fine-tuned Pre-trained Language Models | | 0
Page 1 of 12

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | AMRBART large | Smatch | 76.9 | - | Unverified
2 | Graphene Smatch | Smatch | 76.32 | - | Unverified
3 | SPRING DFS | Smatch | 73.7 | - | Unverified
4 | SPRING DFS + silver | Smatch | 71.8 | - | Unverified
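The Smatch metric used above scores a predicted AMR against a gold AMR as the F1 over matched graph triples. A simplified sketch of that scoring step, assuming a fixed variable mapping is already given (the real Smatch tool additionally searches over variable alignments with restarted hill-climbing; all names and triples here are illustrative):

```python
# Smatch-style F1 over (source, role, target) triples, for a given
# mapping from gold variables to predicted variables. This is the
# scoring step only; alignment search is omitted.
def triple_f1(gold, pred, mapping):
    mapped = {(mapping.get(s, s), r, mapping.get(t, t)) for s, r, t in gold}
    matched = len(mapped & set(pred))
    precision = matched / len(pred) if pred else 0.0
    recall = matched / len(mapped) if mapped else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = [("g1", ":instance", "want-01"), ("g2", ":instance", "boy"),
        ("g1", ":ARG0", "g2")]
pred = [("p1", ":instance", "want-01"), ("p2", ":instance", "dog"),
        ("p1", ":ARG0", "p2")]

# Under the mapping g1->p1, g2->p2, two of three triples match,
# so precision = recall = F1 = 2/3.
print(round(triple_f1(gold, pred, {"g1": "p1", "g2": "p2"}), 4))  # -> 0.6667
```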