
Word Order and World Knowledge

2024-03-01

Qinghua Zhao, Vinit Ravishankar, Nicolas Garneau, Anders Søgaard


Abstract

Word order is an important concept in natural language. In this work, we study how word order affects the induction of world knowledge from raw text by language models, using word analogies to probe for such knowledge. Specifically, in addition to the natural word order, we extract texts in six fixed word orders from five languages and pretrain language models on each of them. We then analyze the results of the fixed word orders on word analogies and show that i) certain fixed word orders consistently outperform or underperform others, though the specifics vary across languages, and ii) the Wov2Lex hypothesis does not hold in pre-trained language models, and the natural word order typically yields mediocre results. The source code will be made publicly available at https://github.com/lshowway/probing_by_analogy.
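As background for the probing setup the abstract describes, the sketch below shows the standard word-analogy test via embedding offset arithmetic (a : b :: c : d, solved as b - a + c with a nearest-neighbour lookup). This is a minimal illustration of the general technique only; the `solve_analogy` function, its arguments, and the example words are assumptions for exposition, not code from the released repository.

```python
import numpy as np

def solve_analogy(a, b, c, embeddings, vocab):
    """Return the word d that best completes the analogy a : b :: c : d.

    embeddings: (V, dim) array of L2-normalised word vectors.
    vocab: dict mapping each word to its row index in `embeddings`.
    Uses the standard offset query b - a + c and cosine similarity,
    excluding the three query words from the candidate set.
    """
    id2word = {i: w for w, i in vocab.items()}
    query = embeddings[vocab[b]] - embeddings[vocab[a]] + embeddings[vocab[c]]
    query /= np.linalg.norm(query)
    scores = embeddings @ query              # cosine similarity (unit vectors)
    for w in (a, b, c):
        scores[vocab[w]] = -np.inf           # never return a query word
    return id2word[int(np.argmax(scores))]

# Example (hypothetical data): a capital-of analogy whose expected answer
# is "paris", given suitable `embeddings` and `vocab`:
#   solve_analogy("germany", "berlin", "france", embeddings, vocab)
```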
