
Syntax Representation in Word Embeddings and Neural Networks -- A Survey

2020-10-02

Tomasz Limisiewicz, David Mareček


Abstract

Neural networks trained on natural language processing tasks capture syntax even though it is not provided as a supervision signal. This indicates that syntactic analysis is essential to the understanding of language in artificial intelligence systems. This overview paper covers approaches to evaluating the amount of syntactic information included in the representations of words for different neural network architectures. We mainly summarize research on English monolingual data for language modeling tasks and on multilingual data for neural machine translation systems and multilingual language models. We describe which pre-trained models and representations of language are best suited for transfer to syntactic tasks.
