
Neural Bi-Lexicalized PCFG Induction

2021-05-31 · ACL 2021 · Code Available

Songlin Yang, Yanpeng Zhao, Kewei Tu

Abstract

Neural lexicalized PCFGs (L-PCFGs) have been shown to be effective in grammar induction. However, to reduce computational complexity, they make a strong independence assumption on the generation of the child word, thus ignoring bilexical dependencies. In this paper, we propose an approach to parameterizing L-PCFGs without making implausible independence assumptions. Our approach directly models bilexical dependencies while reducing both the learning and representation complexity of L-PCFGs. Experimental results on the English WSJ dataset confirm the effectiveness of our approach in improving both running speed and unsupervised parsing performance.
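To make the independence assumption concrete, the following toy sketch contrasts a child-word distribution that ignores the head word (the assumption the abstract criticizes) with a bilexical one that conditions on it, scored here with a bilinear form over word embeddings. All names, dimensions, and the bilinear parameterization are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical sketch: in an L-PCFG, a rule headed by word h generates a
# child head word c. Under the independence assumption, p(c) does not
# depend on h; a bilexical model scores p(c | h) directly.
rng = np.random.default_rng(0)
V = 5                                # toy vocabulary size
emb = rng.standard_normal((V, 4))    # toy word embeddings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Independence assumption: one child distribution shared by all heads.
p_child_indep = softmax(rng.standard_normal(V))

# Bilexical model: score each candidate child word against the head word
# via a bilinear form (one common low-rank way to keep this tractable).
W = rng.standard_normal((4, 4))
def p_child_given_head(h):
    scores = emb @ W @ emb[h]        # one score per candidate child word
    return softmax(scores)

print(p_child_indep)                 # identical for every head word
print(p_child_given_head(2))         # changes with the head word h
```

Under the independence assumption the child distribution is the same regardless of the head, so head–child (bilexical) co-occurrence statistics can never be captured; conditioning on the head restores them at the cost the paper's parameterization is designed to control.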
