TENT: Text Classification Based on ENcoding Tree Learning
Anonymous
Abstract
Text classification is a primary task in natural language processing (NLP). Recently, graph neural networks (GNNs) have developed rapidly and have been applied to text classification tasks. Although more complex models tend to achieve better performance, such research depends heavily on the computing power of the devices used. In this article, we propose TENT, which achieves better text classification performance while reducing the reliance on computing power. Specifically, we first build a dependency analysis graph for each text and then convert each graph into its corresponding encoding tree. Finally, the representation of each text is obtained through Encoding Tree Learning (ETL), which operates on the encoding tree structure and has low computational requirements. Experimental results show that our method outperforms other baselines on several datasets while having a simple structure and few parameters.
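The pipeline described above (dependency graph → encoding tree → pooled text representation) can be sketched in miniature. The code below is a toy illustration under loose assumptions, not the paper's method: the graph, features, and grouping heuristic are all hypothetical, and the real encoding tree would be derived by optimizing a structural objective rather than by the naive connected-component grouping used here.

```python
from collections import defaultdict

# Hypothetical dependency graph for a short text: nodes are word indices,
# edges are (head, dependent) pairs from a dependency parse (toy example).
edges = [(0, 1), (1, 2), (1, 3), (3, 4)]

# Toy per-word feature vectors (in practice these would be word embeddings).
features = {
    0: [1.0, 0.0],
    1: [0.0, 1.0],
    2: [1.0, 1.0],
    3: [0.5, 0.5],
    4: [0.2, 0.8],
}

def build_two_level_tree(edges, nodes):
    """Stand-in for an encoding tree: group nodes into connected components
    via union-find, yielding a root -> communities -> leaves hierarchy.
    This mimics only the tree *shape*; the paper's encoding trees are
    constructed by a principled objective, which this sketch omits."""
    parent = {n: n for n in nodes}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for u, v in edges:
        parent[find(u)] = find(v)
    groups = defaultdict(list)
    for n in nodes:
        groups[find(n)].append(n)
    return list(groups.values())

def tree_pool(groups, features):
    """Bottom-up mean pooling over the tree: leaves -> community -> root,
    producing one fixed-size vector per text."""
    dim = len(next(iter(features.values())))
    community_vecs = []
    for members in groups:
        vec = [sum(features[m][d] for m in members) / len(members)
               for d in range(dim)]
        community_vecs.append(vec)
    # Root representation: mean over community vectors.
    return [sum(v[d] for v in community_vecs) / len(community_vecs)
            for d in range(dim)]

groups = build_two_level_tree(edges, list(features))
text_repr = tree_pool(groups, features)
print(text_repr)
```

Because pooling happens over a small tree rather than through repeated message passing on the full graph, this style of aggregation is cheap, which is consistent with the abstract's claim of low computational requirements.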