Wojciech Zaremba*, Ilya Sutskever, Oriol Vinyals

Abstract
We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique for regularizing neural networks, does not work well with RNNs and LSTMs. In this paper, we show how to correctly apply dropout to LSTMs, and show that it substantially reduces overfitting on a variety of tasks. These tasks include language modeling, speech recognition, image caption generation, and machine translation.
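The paper's recipe is to apply dropout only to the non-recurrent connections of a stacked LSTM (input, between layers, and output), while the recurrent state passed along time steps is left untouched. Below is a minimal PyTorch sketch of that idea; the class name, the hyperparameters (hidden size 650, dropout 0.5, two layers), and the overall wiring are illustrative assumptions, not an official implementation.

```python
import torch
import torch.nn as nn

class NonRecurrentDropoutLSTM(nn.Module):
    """Sketch: dropout on non-recurrent connections only.

    Dropout is applied to the embedding output, between stacked
    LSTM layers, and before the decoder, but never to the hidden
    state carried from one time step to the next.
    """

    def __init__(self, vocab_size, hidden_size=650, num_layers=2, dropout=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.drop = nn.Dropout(dropout)
        self.cells = nn.ModuleList(
            [nn.LSTMCell(hidden_size, hidden_size) for _ in range(num_layers)]
        )
        self.decoder = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens, state=None):
        # tokens: LongTensor of shape (seq_len, batch)
        batch = tokens.size(1)
        if state is None:
            state = [
                (tokens.new_zeros(batch, c.hidden_size, dtype=torch.float),
                 tokens.new_zeros(batch, c.hidden_size, dtype=torch.float))
                for c in self.cells
            ]
        outputs = []
        for x_t in self.embed(tokens):           # iterate over time steps
            h = self.drop(x_t)                   # dropout on the input connection
            for i, cell in enumerate(self.cells):
                h_i, c_i = cell(h, state[i])     # recurrent path: no dropout
                state[i] = (h_i, c_i)
                h = self.drop(h_i)               # dropout between layers / at output
            outputs.append(h)
        logits = self.decoder(torch.stack(outputs))
        return logits, state
```

Note that PyTorch's built-in `nn.LSTM(..., dropout=p)` applies dropout between layers in a similar spirit; the explicit cell loop above is only meant to make visible that the time-step recurrence receives no dropout mask.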
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| Language Modelling on Penn Treebank (word level) | Zaremba et al. (2014) - LSTM (large) | Test perplexity: 78.4; Validation perplexity: 82.2 |
| Language Modelling on Penn Treebank (word level) | Zaremba et al. (2014) - LSTM (medium) | Test perplexity: 82.7; Validation perplexity: 86.2 |
| Machine Translation on WMT2014 English-French | Regularized LSTM | BLEU score: 29.03 |