Generating Sequences With Recurrent Neural Networks
2013-08-04
Alex Graves
Code Available
- github.com/szcom/rnnlib — framework: none, ★ 904
- github.com/grzego/handwriting-generation — framework: tf, ★ 586
- github.com/hardmaru/sketch-rnn-datasets — framework: tf, ★ 220
- github.com/cody2007/arcane_fortune — framework: none, ★ 28
- github.com/altsoph/paranoid_transformer — framework: pytorch, ★ 19
- github.com/robertknight/textgen — framework: pytorch, ★ 4
- github.com/CambridgeIIS/Gesture-Keyboard-Traj-Gen — framework: tf, ★ 3
- github.com/feiwu77777/Handwriting_generation — framework: tf, ★ 2
- gitlab.com/raymondhs/char-rnn-truecase — framework: torch, ★ 0
- github.com/sherjilozair/char-rnn-tensorflow — framework: tf, ★ 0
Abstract
This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.
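The core idea in the abstract — generating a complex sequence "simply by predicting one data point at a time" — can be sketched as an autoregressive sampling loop: sample a symbol from the model's predictive distribution, feed it back in as the next input, and repeat. A minimal illustration, where a hypothetical hand-coded bigram table stands in for the LSTM's learned next-symbol distribution:

```python
import random

# Toy vocabulary; in the paper this would be the character set of the corpus.
VOCAB = ["h", "e", "l", "o"]

def next_probs(prev):
    """Hypothetical next-symbol distribution over VOCAB.

    This fixed table is a stand-in for the trained LSTM's softmax
    output at each step; it is not from the paper.
    """
    table = {
        "h": [0.0, 1.0, 0.0, 0.0],  # after "h", always "e"
        "e": [0.0, 0.0, 1.0, 0.0],  # after "e", always "l"
        "l": [0.0, 0.0, 0.5, 0.5],  # after "l", "l" or "o" equally
        "o": [1.0, 0.0, 0.0, 0.0],  # after "o", wrap back to "h"
    }
    return table[prev]

def generate(seed, length, rng):
    """Generate a sequence one symbol at a time, feeding each
    sampled symbol back in as the next input."""
    out = [seed]
    for _ in range(length - 1):
        weights = next_probs(out[-1])
        out.append(rng.choices(VOCAB, weights=weights, k=1)[0])
    return "".join(out)

rng = random.Random(0)
print(generate("h", 8, rng))
```

The same loop structure applies to the paper's real-valued handwriting case, except that the network then parameterizes a mixture density over pen offsets rather than a softmax over characters.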
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| enwik8 | LSTM (7 layers) | Bits per Character (BPC) | 1.67 | — | Unverified |
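The metric in the table, bits per character (BPC), is the average negative log2-probability the model assigns to each character of the test text; lower is better. A minimal sketch of the computation (the probabilities below are illustrative, not from the paper):

```python
import math

def bits_per_character(char_probs):
    """Average negative log2 probability assigned to each character.

    `char_probs` holds the probability the model gave to the character
    that actually occurred at each position of the evaluation text.
    """
    return -sum(math.log2(p) for p in char_probs) / len(char_probs)

# A model that assigned probability 0.25 to every character would
# cost exactly 2 bits per character:
print(bits_per_character([0.25, 0.25, 0.25, 0.25]))  # -> 2.0
```

BPC is cross-entropy expressed in base 2, so it is directly comparable to the size (in bits) a lossless compressor built on the model would need per character of enwik8.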