Multi-Task Attentive Residual Networks for Argument Mining
Andrea Galassi, Marco Lippi, Paolo Torroni
Code: github.com/AGalassi/StructurePrediction18 (official, TensorFlow, ★ 11)
Abstract
We explore the use of residual networks and neural attention for multiple argument mining tasks. We propose a residual architecture that exploits attention, multi-task learning, and ensembling, without any assumption on document or argument structure. We present an extensive experimental evaluation on five different corpora of user-generated comments, scientific publications, and persuasive essays. Our results show that our approach is a strong competitor against state-of-the-art architectures that have a higher computational footprint or corpus-specific design, representing an interesting compromise between generality, accuracy, and reduced model size.
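The abstract describes a residual architecture combined with attention. A minimal sketch of the core idea, a residual connection wrapped around a self-attention step, is shown below; this is an illustrative NumPy implementation under assumed shapes and single-head scaled dot-product attention, not the paper's actual network.

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # scaled dot-product self-attention over token positions
    # x: (tokens, features)
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)          # (tokens, tokens)
    return softmax(scores) @ x             # (tokens, features)

def residual_attention_block(x):
    # residual connection: output is the input plus attended features,
    # so the block learns a refinement rather than a full transformation
    return x + self_attention(x)

x = np.random.randn(5, 8)                  # 5 tokens, 8 features
y = residual_attention_block(x)
print(y.shape)                             # (5, 8): shape is preserved
```

The residual skip keeps the block shape-preserving, which is what allows such blocks to be stacked and combined with multi-task output heads.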
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| AbstRCT - Neoplasm | ResAttArg | F1 | 54.43 | — | Unverified |
| CDCP | ResAttArg | F1 | 29.73 | — | Unverified |
| DRI Corpus | ResAttArg | F1 | 43.66 | — | Unverified |