SOTAVerified

ChatGPT is a Potential Zero-Shot Dependency Parser

2023-10-25

Boda Lin, Xinyi Zhou, Binghao Tang, Xiaocheng Gong, Si Li

Abstract

Pre-trained language models have been widely used in the dependency parsing task and have yielded significant improvements in parser performance. However, it remains an understudied question whether pre-trained language models can spontaneously exhibit dependency parsing ability in the zero-shot scenario, without introducing additional parser structure. In this paper, we propose to explore the dependency parsing ability of large language models such as ChatGPT and to conduct linguistic analysis. The experimental results demonstrate that ChatGPT is a potential zero-shot dependency parser, and the linguistic analysis also reveals some unique preferences in its parsing outputs.
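The zero-shot setup the abstract describes can be sketched as prompting the model for dependency arcs directly, with no trained parser head, and then reading its textual reply back into structured triples. The prompt wording and the tab-separated output format below are illustrative assumptions, not the paper's actual prompts:

```python
# Hypothetical sketch of zero-shot dependency parsing via an LLM prompt.
# The prompt text and reply format are assumptions for illustration;
# the paper's actual prompts and parsing pipeline may differ.

def build_prompt(sentence: str) -> str:
    """Ask the model for CoNLL-style dependency arcs, zero-shot."""
    return (
        "Perform dependency parsing on the following sentence. "
        "Output one line per word in the format "
        "'index\\tword\\thead_index\\trelation', where head_index 0 is ROOT.\n"
        f"Sentence: {sentence}"
    )

def parse_response(text: str) -> list[tuple[int, str, int, str]]:
    """Parse the model's tab-separated reply into (index, word, head, relation) arcs."""
    arcs = []
    for line in text.strip().splitlines():
        parts = line.split("\t")
        if len(parts) != 4:
            continue  # skip malformed lines the model may emit
        idx, word, head, rel = parts
        arcs.append((int(idx), word, int(head), rel))
    return arcs

# Example: a reply the model might return for "She sings"
reply = "1\tShe\t2\tnsubj\n2\tsings\t0\troot"
print(parse_response(reply))
# → [(1, 'She', 2, 'nsubj'), (2, 'sings', 0, 'root')]
```

Because the model answers in free text, the response parser has to tolerate malformed lines; evaluating the recovered arcs against gold trees (e.g. UAS/LAS) then measures the parsing ability without any parser-specific parameters.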
