
Does Chinese BERT Encode Word Structure?

2020-10-15 · COLING 2020 · Code Available

Yile Wang, Leyang Cui, Yue Zhang


Abstract

Contextualized representations give significantly improved results for a wide range of NLP tasks. Much work has been dedicated to analyzing the features captured by representative models such as BERT. Existing work finds that syntactic, semantic, and word sense knowledge is encoded in BERT. However, little work has investigated word features for character-based languages such as Chinese. We investigate Chinese BERT using both attention weight distribution statistics and probing tasks, finding that (1) word information is captured by BERT; (2) word-level features are mostly in the middle representation layers; (3) downstream tasks make different use of word features in BERT, with POS tagging and chunking relying the most on word features, and natural language inference relying the least on such features.
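The abstract mentions probing tasks over frozen BERT layers. A minimal sketch of that idea (hypothetical code, not the authors' implementation): take per-character vectors from one frozen layer and train a simple linear probe to predict a word-level label, such as whether a character begins a word. Here, synthetic separable vectors stand in for real BERT features so the sketch runs without any model download.

```python
import random

random.seed(0)
DIM = 8  # stand-in for the hidden size of a frozen encoder layer

def synthetic_repr(label):
    # Hypothetical stand-in for a frozen middle-layer BERT vector
    # for one character; label 1 = word-initial (B), 0 = word-internal (I).
    center = 1.0 if label == 1 else -1.0
    return [center + random.gauss(0, 0.3) for _ in range(DIM)]

data = [(synthetic_repr(y), y) for y in [0, 1] * 100]
random.shuffle(data)
train, test = data[:150], data[150:]

# Linear probe: a perceptron trained on the (frozen) representations.
# If the probe classifies well, the layer encodes the word-level feature.
w = [0.0] * DIM
b = 0.0
for _ in range(10):
    for x, y in train:
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
        if pred != y:
            delta = y - pred  # +1 or -1
            w = [wi + delta * xi for wi, xi in zip(w, x)]
            b += delta

accuracy = sum(
    (1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0) == y
    for x, y in test
) / len(test)
print(f"probe accuracy: {accuracy:.2f}")
```

In the paper's setting, the synthetic vectors would be replaced by hidden states from each layer of Chinese BERT, and comparing probe accuracy across layers shows where word-level features concentrate.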
