SOTAVerified

TransSGAN: GAN-based semi-supervised learning for text classification with Transformer Encoder

2021-11-16 · ACL ARR November 2021

Anonymous


Abstract

Recent semi-supervised learning methods for text classification rely on data augmentation, which is time-consuming and sensitive to hyperparameter choices. To overcome this problem, we present TransSGAN, a GAN-based semi-supervised learning method for text classification with a simple architecture, fewer hyperparameters, and a shorter training time than current SOTA models, since it requires no data augmentation. By adding a single transformer encoder block to the Semi-Supervised GAN, we achieve performance comparable to previous SOTA models with extremely few labeled examples (within a 1% difference) and 25% better performance than the baseline GAN-based model. Furthermore, we analyze what the generator produces and what the multi-head self-attention layer in the generator learns. Through this analysis, we validate that our generator produces high-quality data.
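The Semi-Supervised GAN objective the abstract builds on augments the discriminator with an extra "fake" class: labeled real data gets a standard supervised loss over the K real classes, unlabeled real data is pushed away from the fake class, and generated samples are pushed toward it. A minimal NumPy sketch of these three loss terms follows; the class count `K`, batch sizes, and random logits here are hypothetical illustrations, not the paper's actual configuration or code.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

K = 3  # number of real classes (hypothetical)
rng = np.random.default_rng(0)

# Discriminator logits over K+1 classes: K real classes + 1 "fake" class.
logits_labeled = rng.normal(size=(4, K + 1))   # batch of labeled real examples
labels = np.array([0, 2, 1, 0])                # gold classes in [0, K)
probs = softmax(logits_labeled)

# Supervised loss: cross-entropy over the K real classes on labeled data.
sup_loss = -np.log(probs[np.arange(4), labels]).mean()

# Unsupervised "real" loss: unlabeled real data should not land in the
# fake class K, i.e. maximize p(real) = 1 - p(fake).
unsup_real_loss = -np.log(1.0 - probs[:, K] + 1e-8).mean()

# Unsupervised "fake" loss: generated samples should be assigned class K.
logits_fake = rng.normal(size=(4, K + 1))
probs_fake = softmax(logits_fake)
unsup_fake_loss = -np.log(probs_fake[:, K] + 1e-8).mean()
```

In TransSGAN, per the abstract, this discriminator/generator pair additionally contains a transformer encoder block (with multi-head self-attention in the generator); the loss structure above is the standard SGAN part of that design.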
