SOTAVerified

Text Level Graph Neural Network for Text Classification

2019-10-06 · IJCNLP 2019 · Code Available

Lianzhe Huang, Dehong Ma, Sujian Li, Xiaodong Zhang, Houfeng Wang


Abstract

Recently, researchers have explored graph neural network (GNN) techniques for text classification, since GNNs handle complex structures well and preserve global information. However, previous GNN-based methods face two practical problems: a fixed corpus-level graph structure that does not support online testing, and high memory consumption. To tackle these problems, we propose a new GNN-based model that builds a graph for each input text with global parameter sharing, instead of a single graph for the whole corpus. This removes the dependence between an individual text and the entire corpus, which supports online testing while still preserving global information. Besides, we build graphs from much smaller windows in the text, which not only extracts more local features but also significantly reduces the number of edges and thus the memory consumption. Experiments show that our model outperforms existing models on several text classification datasets while consuming less memory.
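The per-text graph construction the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and the `window` parameter are assumptions, and each word is simply connected to the words co-occurring within a small sliding window, so the edge count grows with text length rather than corpus size.

```python
def build_text_graph(tokens, window=2):
    # Hypothetical sketch of per-text graph construction: connect each
    # word to the words that co-occur with it inside a small sliding
    # window, instead of building one graph over the whole corpus.
    # Node identities are the words themselves, so parameters tied to
    # words can be shared globally across texts.
    edges = set()
    for i in range(len(tokens)):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                edges.add((tokens[i], tokens[j]))
    return edges

# Long-range pairs never become edges, which is why the edge count
# (and memory use) stays low compared with a corpus-level graph.
edges = build_text_graph("the cat sat on the mat".split())
```

Because repeated words map to the same vocabulary node, the embeddings attached to those nodes act as the globally shared parameters across all per-text graphs.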

Tasks

Benchmark Results

Dataset | Model      | Metric   | Claimed | Verified | Status
Ohsumed | Our Model* | Accuracy | 69.4    |          | Unverified
R52     | Our Model* | Accuracy | 94.6    |          | Unverified
R8      | Our Model* | Accuracy | 97.8    |          | Unverified

Reproductions