Improved Paraphrase Generation via Controllable Latent Diffusion
Wei Zou, Ziyuan Zhuang, Xiang Geng, ShuJian Huang, Jia Liu, Jiajun Chen
Code: github.com/nil-zhuang/ld4pg (official PyTorch implementation)
Abstract
Paraphrase generation strives to produce high-quality and diverse rewordings of a given text, a task where diffusion models excel. Although state-of-the-art diffusion generation reconciles quality and diversity, textual diffusion suffers from a truncation issue that hinders efficiency and quality control. In this work, we propose the Latent Diffusion Paraphraser (LDP), a novel paraphrase generation method that models a controllable diffusion process over a learned latent space. LDP achieves superior generation efficiency compared to its diffusion counterparts. It can also condition on input segments alone to enforce paraphrase semantics, improving results without external features. Experiments show that LDP reconciles paraphrase quality and diversity better than baselines. Further analysis shows that our method also benefits similar text generation tasks and domain adaptation.
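To make the core idea concrete, the following is a minimal, self-contained sketch of diffusion in a learned latent space: a source sentence is (hypothetically) encoded to a latent vector, noised by the forward process, then denoised by ancestral sampling before being decoded into a paraphrase. The encoder, decoder, and `denoiser` network here are stand-in placeholders, not LDP's actual architecture; the noise schedule and sampling loop follow the standard DDPM formulation, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard DDPM noise schedule (assumed for illustration; LDP's actual
# schedule and step count may differ).
T = 50
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def q_sample(z0, t, eps):
    """Forward process: noise a clean latent z0 to timestep t."""
    return np.sqrt(alpha_bars[t]) * z0 + np.sqrt(1.0 - alpha_bars[t]) * eps

def denoiser(z_t, t):
    """Stand-in for a learned noise-prediction network (hypothetical).

    A real model would be trained to predict the noise added at step t,
    optionally conditioned on the source sentence for controllability.
    """
    return np.zeros_like(z_t)

def ddpm_reverse(z_T):
    """Ancestral sampling from z_T back toward z_0 in latent space."""
    z = z_T
    for t in reversed(range(T)):
        eps_hat = denoiser(z, t)
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (z - coef * eps_hat) / np.sqrt(alphas[t])
        noise = rng.standard_normal(z.shape) if t > 0 else np.zeros_like(z)
        z = mean + np.sqrt(betas[t]) * noise
    return z

# Pretend pipeline: encoder(source) -> z0, diffuse, denoise, decode(z_hat).
z0 = rng.standard_normal(8)                      # stand-in encoded latent
zT = q_sample(z0, T - 1, rng.standard_normal(8)) # fully noised latent
z_hat = ddpm_reverse(zT)                         # denoised latent to decode
```

Because the latent is continuous, sampling avoids rounding intermediate states back to discrete tokens at each step, which is the efficiency argument the abstract gestures at.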