
Surface Realization Using Pretrained Language Models

2020-12-01 · MSR (COLING) 2020

Farhood Farahnak, Laya Rafiee, Leila Kosseim, Thomas Fevens


Abstract

In the context of Natural Language Generation, surface realization is the task of generating the linear form of a text that follows a given grammar. Surface realization models usually consist of a cascade of complex sub-modules, either rule-based or neural, each responsible for a specific sub-task. In this work, we show that a single encoder-decoder language model can be used end-to-end for all sub-tasks of surface realization. The model is based on the BART language model: it receives a linear representation of the unordered, non-inflected tokens of a sentence along with their Universal Dependency information, and produces the linear sequence of inflected tokens together with the missing words. The model was evaluated on the shallow and deep tracks of the 2020 Surface Realization Shared Task (SR'20) using both human and automatic evaluation. The results indicate that, despite its simplicity, our model achieves results competitive with those of all participants in the shared task.
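To make the input side concrete, one plausible way to build the "linear representation of unordered and non-inflected tokens along with their Universal Dependency information" is to flatten each token's lemma and UD attributes into a tagged string for the encoder. This is a minimal sketch: the field names (`lemma`, `upos`, `deprel`, `head`) and the separator tokens (`<pos>`, `<dep>`, `<head>`, `<sep>`) are assumptions for illustration, not the authors' exact format.

```python
# Hedged sketch of linearizing an unordered bag of UD tokens into a flat
# encoder input string. The key names and special separator tokens below
# are illustrative assumptions, not the paper's actual scheme.

def linearize(tokens):
    """Flatten unordered UD tokens into a single encoder input string.

    Each token is a dict with (assumed) keys:
    'lemma', 'upos', 'deprel', and 'head' (0 marks the root).
    """
    pieces = []
    for tok in tokens:
        pieces.append(
            f"{tok['lemma']} <pos> {tok['upos']} "
            f"<dep> {tok['deprel']} <head> {tok['head']}"
        )
    # Tokens are joined in arbitrary order; recovering the correct surface
    # order and inflections is left entirely to the seq2seq model.
    return " <sep> ".join(pieces)

# Unordered, non-inflected tokens for the target sentence "The cats sleep."
tokens = [
    {"lemma": "sleep", "upos": "VERB", "deprel": "root", "head": 0},
    {"lemma": "cat", "upos": "NOUN", "deprel": "nsubj", "head": 1},
    {"lemma": "the", "upos": "DET", "deprel": "det", "head": 2},
]
print(linearize(tokens))
```

A string of this shape could then be fed to any pretrained encoder-decoder (such as BART) fine-tuned to emit the ordered, inflected sentence, which is the end-to-end setup the abstract describes.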
