ResearchTrend.AI
Persistent learning signals and working memory without continuous attractors
24 August 2023
Il Memming Park, Ábel Ságodi, Piotr Sokól

Papers citing "Persistent learning signals and working memory without continuous attractors"

6 / 6 papers shown
Improving World Models using Deep Supervision with Linear Probes
Andrii Zahorodnii (04 Apr 2025)
Back to the Continuous Attractor
Ábel Ságodi, Guillermo Martín-Sánchez, Piotr Sokól, Il Memming Park (31 Jul 2024)
Recurrent neural networks: vanishing and exploding gradients are not the end of the story
Nicolas Zucchet, Antonio Orvieto (31 May 2024)
Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians
Rainer Engelken (28 Dec 2023)
Resurrecting Recurrent Neural Networks for Long Sequences
Antonio Orvieto, Samuel L. Smith, Albert Gu, Anushan Fernando, Çağlar Gülçehre, Razvan Pascanu, Soham De (11 Mar 2023)
On the difficulty of learning chaotic dynamics with RNNs
Jonas M. Mikhaeil, Zahra Monfared, Daniel Durstewitz (14 Oct 2021)