Hierarchical Neural Story Generation
Angela Fan, Mike Lewis, Yann Dauphin
Code
- github.com/pytorch/fairseq (official, in paper; PyTorch, ★ 32,198)
- github.com/facebookresearch/fairseq (PyTorch, ★ 32,199)
- github.com/yangkevin2/emnlp22-re3-story-generation (PyTorch, ★ 255)
- github.com/Advancing-Machine-Human-Reasoning-Lab/apt (PyTorch, ★ 25)
- github.com/AnirudhMukherjee/story-generation (TensorFlow, ★ 0)
- github.com/pumpkinman008/StoryGeneration (TensorFlow, ★ 0)
- github.com/Kickflip89/NLP_Project (PyTorch, ★ 0)
Abstract
We explore story generation: creative systems that can build coherent and fluent passages of text about a topic. We collect a large dataset of 300K human-written stories paired with writing prompts from an online forum. Our dataset enables hierarchical story generation, where the model first generates a premise and then transforms it into a passage of text. We gain further improvements with a novel form of model fusion that improves the relevance of the story to the prompt, and by adding a new gated multi-scale self-attention mechanism to model long-range context. Experiments show large improvements over strong baselines on both automated and human evaluations. Human judges prefer stories generated by our approach to those from a strong non-hierarchical model by a factor of two to one.
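The gated multi-scale self-attention idea from the abstract can be sketched roughly as follows: one causal attention head per scale, with keys and values downsampled by striding so that coarser heads see a compressed view of the past, and a sigmoid gate modulating each head's output. This is a minimal NumPy illustration under assumed shapes and a GLU-style gate; the paper's actual convolutional seq2seq architecture and its exact gating and downsampling details differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_attention(q, k, v, key_pos):
    """Scaled dot-product attention where query t may only attend to
    keys whose original timestep is <= t (causal masking)."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    future = key_pos[None, :] > np.arange(q.shape[0])[:, None]
    scores = np.where(future, -1e9, scores)
    return softmax(scores) @ v

def gated_multiscale_self_attention(x, params, scales=(1, 2, 4)):
    """Illustrative sketch: one head per scale; keys/values are
    downsampled by stride s, and a per-position sigmoid gate scales
    each head's output before the heads are concatenated."""
    T, _ = x.shape
    heads = []
    for s, (wq, wk, wv, wg) in zip(scales, params):
        key_pos = np.arange(0, T, s)            # timesteps kept at this scale
        q = x @ wq
        k = (x @ wk)[::s]                       # coarser view of the context
        v = (x @ wv)[::s]
        gate = 1.0 / (1.0 + np.exp(-(x @ wg)))  # sigmoid gate (GLU-style)
        heads.append(gate * causal_attention(q, k, v, key_pos))
    return np.concatenate(heads, axis=-1)

# Toy dimensions and random weights, for illustration only.
T, d_model, d_head = 8, 16, 4
x = rng.standard_normal((T, d_model))
params = [tuple(rng.standard_normal((d_model, d_head)) for _ in range(4))
          for _ in (1, 2, 4)]
out = gated_multiscale_self_attention(x, params)
print(out.shape)  # (8, 12): T timesteps, 3 heads * d_head
```

Because every key a query can see lies at or before that query's timestep, perturbing the last input position leaves all earlier outputs unchanged, which is the causality property an autoregressive story model needs.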