Sufficient conditions for offline reactivation in recurrent neural networks

22 May 2025
Nanda H. Krishna
Colin Bredenberg
Daniel Levenstein
Blake A. Richards
Guillaume Lajoie
Abstract

During periods of quiescence, such as sleep, neural activity in many brain circuits resembles that observed during periods of task engagement. However, the precise conditions under which task-optimized networks can autonomously reactivate the same network states responsible for online behavior are poorly understood. In this study, we develop a mathematical framework that outlines sufficient conditions for the emergence of neural reactivation in circuits that encode features of smoothly varying stimuli. We demonstrate mathematically that noisy recurrent networks optimized to track environmental state variables using change-based sensory information naturally develop denoising dynamics, which, in the absence of input, cause the network to revisit state configurations observed during periods of online activity. We validate our findings using numerical experiments on two canonical neuroscience tasks: spatial position estimation based on self-motion cues, and head direction estimation based on angular velocity cues. Overall, our work provides theoretical support for modeling offline reactivation as an emergent consequence of task optimization in noisy neural circuits.
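To make the setup concrete, below is a minimal sketch (not the authors' code) of the head direction task described in the abstract: a noisy recurrent network is trained to integrate angular-velocity inputs into a heading estimate, and the trained network can then be run with zero input to check whether its spontaneous activity revisits states seen during task engagement. The architecture, hyperparameters, and names (NoisyRNN, train_step) are illustrative assumptions, not taken from the paper.

# Minimal sketch, assuming a vanilla tanh RNN with injected state noise,
# trained to integrate angular velocity into a (cos, sin) heading readout.
import torch
import torch.nn as nn

class NoisyRNN(nn.Module):
    def __init__(self, n_hidden=128, noise_std=0.1):
        super().__init__()
        self.n_hidden = n_hidden
        self.noise_std = noise_std
        self.w_in = nn.Linear(1, n_hidden)          # angular-velocity input
        self.w_rec = nn.Linear(n_hidden, n_hidden)  # recurrent weights
        self.readout = nn.Linear(n_hidden, 2)       # (cos, sin) of heading

    def forward(self, vel, h0=None):
        # vel: (batch, time, 1); pass zeros to simulate "offline" periods
        b, t, _ = vel.shape
        h = torch.zeros(b, self.n_hidden) if h0 is None else h0
        states, outputs = [], []
        for step in range(t):
            noise = self.noise_std * torch.randn_like(h)
            h = torch.tanh(self.w_rec(h) + self.w_in(vel[:, step]) + noise)
            states.append(h)
            outputs.append(self.readout(h))
        return torch.stack(outputs, 1), torch.stack(states, 1)

def train_step(model, opt, vel, heading):
    # heading: (batch, time) target angle from cumulatively summing vel
    target = torch.stack([torch.cos(heading), torch.sin(heading)], dim=-1)
    out, _ = model(vel)
    loss = ((out - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example usage (illustrative):
#   model = NoisyRNN()
#   opt = torch.optim.Adam(model.parameters(), lr=1e-3)
#   vel = 0.3 * torch.randn(32, 200, 1)
#   heading = vel.squeeze(-1).cumsum(dim=1)
#   train_step(model, opt, vel, heading)
#   # Offline probe: no input, compare visited states to online states
#   _, offline_states = model(torch.zeros(32, 200, 1))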

@article{krishna2025_2505.17003,
  title={Sufficient conditions for offline reactivation in recurrent neural networks},
  author={Nanda H. Krishna and Colin Bredenberg and Daniel Levenstein and Blake A. Richards and Guillaume Lajoie},
  journal={arXiv preprint arXiv:2505.17003},
  year={2025}
}