
arXiv:1605.01988 (v3, latest)

LSTM with Working Memory

6 May 2016
Andrew Pulver
Siwei Lyu
Abstract

LSTM is arguably the most successful RNN architecture for many tasks that involve sequential information. In the past few years there have been several proposed improvements to LSTM. We propose an improvement to LSTM which allows communication between memory cells in different blocks and allows an LSTM layer to carry out internal computation within its memory.
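The abstract describes two extensions to the standard LSTM: cross-block communication between memory cells, and internal computation on the memory itself. The paper's equations are not reproduced on this page, so the sketch below is only a minimal illustration of that idea under one assumption: that "internal computation within its memory" can be modeled as a learned dense transform applied to the full cell-state vector before the usual gated update, which also lets cells in different blocks interact. The function names, parameter shapes, and the placement of the transform are all hypothetical, not the authors' published formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_wm_step(x, h_prev, c_prev, params):
    """One step of an LSTM cell extended with a 'working memory' transform.

    Standard LSTM gating, plus a hypothetical dense matrix W_m applied
    across the whole cell-state vector so memory cells in different
    blocks can interact -- an illustrative assumption, not the paper's
    published equations.
    """
    W, U, b, W_m = params["W"], params["U"], params["b"], params["W_m"]
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # stacked pre-activations for 4 gates
    i = sigmoid(z[0:H])                 # input gate
    f = sigmoid(z[H:2 * H])             # forget gate
    o = sigmoid(z[2 * H:3 * H])         # output gate
    g = np.tanh(z[3 * H:4 * H])         # candidate memory content
    c_mixed = np.tanh(W_m @ c_prev)     # internal computation over memory,
                                        # mixing cells across blocks (assumed)
    c = f * c_mixed + i * g             # gated write into the mixed memory
    h = o * np.tanh(c)                  # expose gated memory as output
    return h, c

def init_params(x_dim, h_dim, seed=0):
    """Random small-scale parameters for the sketch above."""
    rng = np.random.default_rng(seed)
    return {
        "W": rng.normal(0.0, 0.1, (4 * h_dim, x_dim)),
        "U": rng.normal(0.0, 0.1, (4 * h_dim, h_dim)),
        "b": np.zeros(4 * h_dim),
        "W_m": rng.normal(0.0, 0.1, (h_dim, h_dim)),
    }
```

In a standard LSTM the update would be `c = f * c_prev + i * g`; the only change here is routing `c_prev` through `tanh(W_m @ c_prev)` first, so the memory is transformed as a whole rather than cell-by-cell. Whether the transform sits before or after the forget gate in the actual paper is not determinable from this page.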
