SOTAVerified

Abusive Language Detection using Syntactic Dependency Graphs

2020-11-01 · EMNLP (ALW) 2020

Kanika Narang, Chris Brew


Abstract

Automated detection of abusive language online has become imperative. Current sequential models (LSTMs) do not work well for long and complex sentences, while bi-transformer models (BERT) are not computationally efficient for the task. We show that classifiers based on the syntactic structure of the text, dependency graph convolutional networks (DepGCNs), can achieve state-of-the-art performance on abusive language datasets. The overall performance is on par with that of strong baselines such as fine-tuned BERT. Further, our GCN-based approach is much more efficient than BERT at inference time, making it suitable for real-time detection.
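The abstract does not spell out the DepGCN architecture, but the core idea of convolving token representations over a dependency graph can be illustrated with a generic Kipf-and-Welling-style GCN layer. The sketch below is an assumption-laden illustration, not the authors' exact model: the adjacency matrix, self-loops, row normalization, and random toy inputs are all hypothetical choices for the example.

```python
import numpy as np

def dep_gcn_layer(X, A, W):
    """One graph-convolution layer over a sentence's dependency graph.

    X: (n_tokens, d_in)  token embeddings
    A: (n_tokens, n_tokens) symmetric adjacency matrix of the
       (undirected) dependency tree
    W: (d_in, d_out) learned weight matrix

    Adds self-loops, row-normalizes, then applies ReLU(A_norm @ X @ W).
    """
    A_hat = A + np.eye(A.shape[0])                   # self-loops
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)   # row normalization
    return np.maximum(0.0, (D_inv * A_hat) @ X @ W)

# Hypothetical 3-token sentence: edges token0-token2 and token1-token2
A = np.array([[0, 0, 1],
              [0, 0, 1],
              [1, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 2))
H = dep_gcn_layer(X, A, W)
print(H.shape)  # (3, 2): one hidden vector per token
```

Stacking such layers lets each token aggregate information from syntactic neighbors several hops away; a pooled sentence representation would then feed a classifier, which is one plausible way a DepGCN-style model could be assembled.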
