
Neural Machine Translation and Sequence-to-sequence Models: A Tutorial

2017-03-05

Graham Neubig


Abstract

This tutorial introduces a new and powerful set of techniques variously called "neural machine translation" or "neural sequence-to-sequence models". These techniques have been used in a number of tasks regarding the handling of human language, and can be a powerful tool in the toolbox of anyone who wants to model sequential data of some sort. The tutorial assumes that the reader knows the basics of math and programming, but does not assume any particular experience with neural networks or natural language processing. It attempts to explain the intuition behind the various methods covered, then delves into them with enough mathematical detail to understand them concretely, and culminates with a suggested implementation exercise, where readers can test their understanding in practice.
