Discovering Non-Monotonic Autoregressive Ordering for Text Generation Models using Sinkhorn Distributions

2022-01-17 · ICLR Blog Track 2022

Anonymous

Abstract

In this blog post, we discuss an important but under-researched topic: discovering non-monotonic orderings that guide models toward generating high-quality text. We specifically discuss the model proposed in the ICLR 2021 paper by Li et al. (2021). This model uses Gumbel-Sinkhorn distributions to assist a decoder model by providing good-quality generation orders during training. The trained models generate high-quality outputs on four important NLG tasks: (a) image captioning, (b) code generation, (c) text summarization, and (d) machine translation. Interestingly, the model's behavior mirrors human behavior in some sense: considering what to write about before figuring out how to write about it.
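To make the Gumbel-Sinkhorn idea concrete, here is a minimal NumPy sketch of how one can sample a (soft) permutation matrix, which can be read as a relaxed generation order over tokens. This is an illustration under our own assumptions, not the paper's implementation; the function names, temperature parameter, and iteration count are ours.

```python
import numpy as np

def logsumexp(x, axis):
    # Numerically stable log-sum-exp, keeping dimensions for broadcasting.
    m = x.max(axis=axis, keepdims=True)
    return m + np.log(np.exp(x - m).sum(axis=axis, keepdims=True))

def sinkhorn(log_alpha, n_iters=50):
    # Alternate row and column normalization in log space; the result
    # converges to a doubly-stochastic matrix (Sinkhorn's theorem).
    for _ in range(n_iters):
        log_alpha = log_alpha - logsumexp(log_alpha, axis=1)
        log_alpha = log_alpha - logsumexp(log_alpha, axis=0)
    return np.exp(log_alpha)

def gumbel_sinkhorn(log_alpha, tau=1.0, n_iters=50, rng=None):
    # Perturb the score matrix with Gumbel noise, scale by a temperature
    # tau, then apply Sinkhorn normalization. As tau -> 0 the output
    # approaches a hard permutation matrix, i.e. a sampled ordering.
    rng = np.random.default_rng() if rng is None else rng
    gumbel = -np.log(-np.log(rng.uniform(size=log_alpha.shape)))
    return sinkhorn((log_alpha + gumbel) / tau, n_iters)

# Sample a soft ordering over 4 positions from random scores.
scores = np.random.default_rng(0).normal(size=(4, 4))
P = gumbel_sinkhorn(scores, tau=1.0, rng=np.random.default_rng(1))
print(P)  # rows and columns each sum to approximately 1
```

In the paper's setting, an encoder produces the score matrix and the sampled (soft) permutation supplies the order in which the decoder is trained to emit tokens, rather than the fixed left-to-right order.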
