Nested LSTMs

31 January 2018
Joel Ruben Antony Moniz
David M. Krueger
arXiv:1801.10308
Abstract

We propose Nested LSTMs (NLSTM), a novel RNN architecture with multiple levels of memory. Nested LSTMs add depth to LSTMs via nesting as opposed to stacking. The value of a memory cell in an NLSTM is computed by an LSTM cell, which has its own inner memory cell. Specifically, instead of computing the value of the (outer) memory cell as $c^{outer}_t = f_t \odot c_{t-1} + i_t \odot g_t$, NLSTM memory cells use the concatenation $(f_t \odot c_{t-1}, i_t \odot g_t)$ as input to an inner LSTM (or NLSTM) memory cell, and set $c^{outer}_t = h^{inner}_t$. Nested LSTMs outperform both stacked and single-layer LSTMs with similar numbers of parameters in our experiments on various character-level language modeling tasks, and the inner memories of an NLSTM learn longer-term dependencies compared with the higher-level units of a stacked LSTM.
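
To make the nesting concrete, below is a minimal sketch of a single NLSTM time step in NumPy. The function and weight names (nested_lstm_step, W_outer, W_inner) are illustrative, not taken from the authors' code, and biases are omitted for brevity; the inner cell is an ordinary LSTM that receives $f_t \odot c_{t-1}$ as its hidden state and $i_t \odot g_t$ as its input, and its new hidden state becomes the outer memory $c^{outer}_t$.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_gates(W, x, h):
    # Pre-activations for the input, forget, output gates and the candidate,
    # computed from the concatenated [x, h] vector (biases omitted).
    z = W @ np.concatenate([x, h])
    i, f, o, g = np.split(z, 4)
    return sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)

def nested_lstm_step(x, state, W_outer, W_inner):
    """One NLSTM step (illustrative sketch).

    state = (h_outer, c_outer, c_inner), where c_outer is the inner cell's
    hidden state from the previous step.
    """
    h_outer, c_outer, c_inner = state

    # Outer gates, computed exactly as in a standard LSTM.
    i, f, o, g = lstm_gates(W_outer, x, h_outer)

    # Instead of c_outer = f*c_outer + i*g, feed the two terms to an inner LSTM:
    # (i*g) plays the role of its input, (f*c_outer) the role of its hidden state.
    inner_x, inner_h = i * g, f * c_outer
    i2, f2, o2, g2 = lstm_gates(W_inner, inner_x, inner_h)
    c_inner = f2 * c_inner + i2 * g2
    h_inner = o2 * np.tanh(c_inner)

    c_outer = h_inner                # the inner hidden state is the outer memory
    h_outer = o * np.tanh(c_outer)
    return h_outer, (h_outer, c_outer, c_inner)

# Example usage with hidden size 8, input size 4, and random weights.
d, n = 8, 4
rng = np.random.default_rng(0)
W_outer = rng.normal(size=(4 * d, n + d)) * 0.1
W_inner = rng.normal(size=(4 * d, 2 * d)) * 0.1
state = (np.zeros(d), np.zeros(d), np.zeros(d))
h, state = nested_lstm_step(rng.normal(size=n), state, W_outer, W_inner)
```

Deeper nesting, as the abstract notes, would replace the inner LSTM update with another NLSTM cell of the same form.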
