Inverse Approximation Theory for Nonlinear Recurrent Neural Networks

International Conference on Learning Representations (ICLR), 2023
30 May 2023
Shida Wang, Zhong Li, Qianxiao Li
Links: arXiv (abs) · PDF · HTML · HuggingFace (1 upvote) · GitHub (18★)

Papers citing "Inverse Approximation Theory for Nonlinear Recurrent Neural Networks"

9 papers shown:

  1. Understanding Input Selectivity in Mamba: Impact on Approximation Power, Memorization, and Associative Recall Capacity. Ningyuan Huang, Miguel Sarabia, Abhinav Moudgil, P. Rodríguez, Luca Zappella, Federico Danieli. 13 Jun 2025.
  2. Numerical Investigation of Sequence Modeling Theory using Controllable Memory Functions. Haotian Jiang, Zeyu Bao, Shida Wang, Qianxiao Li. 06 Jun 2025.
  3. LongSSM: On the Length Extension of State-space Models in Language Modelling. Shida Wang. 04 Jun 2024.
  4. Recurrent neural networks: vanishing and exploding gradients are not the end of the story. Nicolas Zucchet, Antonio Orvieto. 31 May 2024.
  5. From Generalization Analysis to Optimization Designs for State Space Models. International Conference on Machine Learning (ICML), 2024. Fusheng Liu, Qianxiao Li. 04 May 2024.
  6. StableSSM: Alleviating the Curse of Memory in State-space Models through Stable Reparameterization. International Conference on Machine Learning (ICML), 2023. Shida Wang, Qianxiao Li. 24 Nov 2023.
  7. State-space Models with Layer-wise Nonlinearity are Universal Approximators with Exponential Decaying Memory. Neural Information Processing Systems (NeurIPS), 2023. Shida Wang, Beichen Xue. 23 Sep 2023.
  8. HyperSNN: A new efficient and robust deep learning model for resource constrained control applications. Zhanglu Yan, Shida Wang, Kaiwen Tang, Weng-Fai Wong. 16 Aug 2023.
  9. Improve Long-term Memory Learning Through Rescaling the Error Temporally. Shida Wang, Zhanglu Yan. 21 Jul 2023.