A Brief Survey on the Approximation Theory for Sequence Modelling

27 February 2023
Hao Jiang, Qianxiao Li, Zhong Li, Shida Wang
AI4TS

Papers citing "A Brief Survey on the Approximation Theory for Sequence Modelling"

15 papers:

State-space systems as dynamic generative models
Juan-Pablo Ortega, Florian Rossmannek
13 Mar 2025

Fading memory and the convolution theorem
Juan-Pablo Ortega, Florian Rossmannek
14 Aug 2024

LongSSM: On the Length Extension of State-space Models in Language Modelling
Shida Wang
04 Jun 2024

Rethinking Transformers in Solving POMDPs
Chenhao Lu, Ruizhe Shi, Yuyao Liu, Kaizhe Hu, Simon S. Du, Huazhe Xu
AI4CE
27 May 2024

Prompting a Pretrained Transformer Can Be a Universal Approximator
Aleksandar Petrov, Philip H. S. Torr, Adel Bibi
22 Feb 2024

A mathematical perspective on Transformers
Borjan Geshkovski, Cyril Letrouit, Yury Polyanskiy, Philippe Rigollet
EDL, AI4CE
17 Dec 2023

StableSSM: Alleviating the Curse of Memory in State-space Models through Stable Reparameterization
Shida Wang, Qianxiao Li
24 Nov 2023

Parrot Mind: Towards Explaining the Complex Task Reasoning of Pretrained Large Language Models with Template-Content Structure
Haotong Yang, Fanxu Meng, Zhouchen Lin, Muhan Zhang
LRM
09 Oct 2023

State-space Models with Layer-wise Nonlinearity are Universal Approximators with Exponential Decaying Memory
Shida Wang, Beichen Xue
23 Sep 2023

Improve Long-term Memory Learning Through Rescaling the Error Temporally
Shida Wang, Zhanglu Yan
21 Jul 2023

Inverse Approximation Theory for Nonlinear Recurrent Neural Networks
Shida Wang, Zhong Li, Qianxiao Li
30 May 2023

Forward and Inverse Approximation Theory for Linear Temporal Convolutional Networks
Hao Jiang, Qianxiao Li
AI4TS
29 May 2023

Small Transformers Compute Universal Metric Embeddings
Anastasis Kratsios, Valentin Debarnot, Ivan Dokmanić
14 Sep 2022

Universal Approximation Under Constraints is Possible with Transformers
Anastasis Kratsios, Behnoosh Zamanlooy, Tianlin Liu, Ivan Dokmanić
07 Oct 2021

On the Provable Generalization of Recurrent Neural Networks
Lifu Wang, Bo Shen, Bo Hu, Xing Cao
29 Sep 2021