
A Zero-Resource Approach to Cross-Lingual Query-Focused Abstractive Summarization

2021-11-16 · ACL ARR November 2021

Anonymous


Abstract

We present a novel approach for cross-lingual query-focused abstractive summarization (QFAS) that leverages the translate-then-summarize paradigm. We approach cross-lingual QFAS as a zero-resource problem and introduce a framework that creates a synthetic QFAS corpus from a standard summarization corpus using a novel query-generation strategy. Our model summarizes documents in foreign languages for which translation quality is poor. It learns not only to identify and condense salient information relevant to a query, but also to correct grammatical errors and disfluencies introduced by noisy translation. Our technique enhances a pre-trained encoder-decoder transformer by introducing query focus to the encoder. We show that our method for creating synthetic QFAS data leads to more robust models that not only achieve state-of-the-art performance on our corpus, but also outperform prior work on out-of-distribution data.
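The abstract describes deriving synthetic query-focused training pairs from a standard (document, summary) corpus and injecting query focus into the encoder input. A minimal sketch of one plausible instantiation of that data-construction step — the query heuristic, separator token, and function names below are illustrative assumptions, not the paper's actual method:

```python
def generate_query(summary: str, max_words: int = 8) -> str:
    """Hypothetical query-generation heuristic: take the leading
    content words of the reference summary as a pseudo-query."""
    return " ".join(summary.split()[:max_words])

def build_qfas_example(document: str, summary: str, sep: str = "<q>") -> dict:
    """Turn a (document, summary) pair from a standard summarization
    corpus into a synthetic query-focused example by prepending the
    generated query to the encoder input."""
    query = generate_query(summary)
    return {
        "encoder_input": f"{query} {sep} {document}",
        "target": summary,
    }

example = build_qfas_example(
    document="The storm hit the coast on Monday, causing widespread flooding.",
    summary="Storm causes coastal flooding on Monday.",
)
```

A pre-trained encoder-decoder would then be fine-tuned on `encoder_input` → `target` pairs, so the encoder sees the query alongside the (possibly noisily translated) document.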
