arXiv:1906.06349
On the Computational Power of RNNs
Samuel A. Korsky, R. Berwick. 14 June 2019.
Papers citing "On the Computational Power of RNNs" (33 papers)
A Constructive Framework for Nondeterministic Automata via Time-Shared, Depth-Unrolled Feedforward Networks
Sahil Rajesh Dhayalkar. 30 May 2025.

Fixed-Point RNNs: Interpolating from Diagonal to Dense
Sajad Movahedi, Felix Sarnthein, Nicola Muca Cirone, Antonio Orvieto. 13 Mar 2025.

Compositional Reasoning with Transformers, RNNs, and Chain of Thought
Gilad Yehudai, Noah Amsel, Joan Bruna. 03 Mar 2025.

Autoregressive Large Language Models are Computationally Universal
Dale Schuurmans, Hanjun Dai, Francesco Zanini. 04 Oct 2024.

On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning
Franz Nowak, Anej Svete, Alexandra Butoi, Robert Bamler. 20 Jun 2024.

Separations in the Representational Capabilities of Transformers and Recurrent Architectures
S. Bhattamishra, Michael Hahn, Phil Blunsom, Varun Kanade. 13 Jun 2024.

A Tensor Decomposition Perspective on Second-order RNNs
International Conference on Machine Learning (ICML), 2024. M. Lizaire, Michael Rizvi-Martel, Marawan Gamal Abdel Hameed, Guillaume Rabusseau. 07 Jun 2024.

What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages
Annual Meeting of the Association for Computational Linguistics (ACL), 2024. Nadav Borenstein, Anej Svete, R. Chan, Josef Valvoda, Franz Nowak, Isabelle Augenstein, Eleanor Chodroff, Robert Bamler. 06 Jun 2024.

Lower Bounds on the Expressivity of Recurrent Neural Language Models
Anej Svete, Franz Nowak, Anisha Mohamed Sahabdeen, Robert Bamler. 29 May 2024.

Rethinking Transformers in Solving POMDPs
Chenhao Lu, Ruizhe Shi, Yuyao Liu, Kaizhe Hu, Simon S. Du, Huazhe Xu. 27 May 2024.

Theoretical Foundations of Deep Selective State-Space Models
Nicola Muca Cirone, Antonio Orvieto, Benjamin Walker, C. Salvi, Terry Lyons. 29 Feb 2024.

On Efficiently Representing Regular Languages as RNNs
Anej Svete, R. Chan, Robert Bamler. 24 Feb 2024.

Practical Computational Power of Linear Transformers and Their Recurrent and Self-Referential Extensions
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023. Kazuki Irie, Róbert Csordás, Jürgen Schmidhuber. 24 Oct 2023.

On the Representational Capacity of Recurrent Neural Language Models
Franz Nowak, Anej Svete, Li Du, Robert Bamler. 19 Oct 2023.

Recurrent Neural Language Models as Probabilistic Finite-state Automata
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023. Anej Svete, Robert Bamler. 08 Oct 2023.

Simplicity Bias in Transformers and their Ability to Learn Sparse Boolean Functions
Annual Meeting of the Association for Computational Linguistics (ACL), 2022. S. Bhattamishra, Arkil Patel, Varun Kanade, Phil Blunsom. 22 Nov 2022.

Neural Networks and the Chomsky Hierarchy
International Conference on Learning Representations (ICLR), 2022. Grégoire Delétang, Anian Ruoss, Jordi Grau-Moya, Tim Genewein, L. Wenliang, ..., Chris Cundy, Marcus Hutter, Shane Legg, Joel Veness, Pedro A. Ortega. 05 Jul 2022.

Learning Bounded Context-Free-Grammar via LSTM and the Transformer: Difference and Explanations
Hui Shi, Sicun Gao, Yuandong Tian, Xinyun Chen, Jishen Zhao. 16 Dec 2021.

Statistically Meaningful Approximation: a Case Study on Approximating Turing Machines with Transformers
Neural Information Processing Systems (NeurIPS), 2021. Colin Wei, Yining Chen, Tengyu Ma. 28 Jul 2021.

Learning and Generalization in RNNs
Neural Information Processing Systems (NeurIPS), 2021. A. Panigrahi, Navin Goyal. 31 May 2021.

Self-Attention Networks Can Process Bounded Hierarchical Languages
Annual Meeting of the Association for Computational Linguistics (ACL), 2021. Shunyu Yao, Binghui Peng, Christos H. Papadimitriou, Karthik Narasimhan. 24 May 2021.

Stronger Separation of Analog Neuron Hierarchy by Deterministic Context-Free Languages
Neurocomputing, 2021. Jirí Síma. 02 Feb 2021.

Synthesizing Context-free Grammars from Recurrent Neural Networks (Extended Version)
International Conference on Tools and Algorithms for Construction and Analysis of Systems (TACAS), 2021. D. Yellin, Gail Weiss. 20 Jan 2021.

On the Practical Ability of Recurrent Neural Networks to Recognize Hierarchical Languages
S. Bhattamishra, Kabir Ahuja, Navin Goyal. 08 Nov 2020.

Evaluating Attribution Methods using White-Box LSTMs
BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP (BlackboxNLP), 2020. Sophie Hao. 16 Oct 2020.

RNNs can generate bounded hierarchical languages with optimal memory
John Hewitt, Michael Hahn, Surya Ganguli, Abigail Z. Jacobs, Christopher D. Manning. 15 Oct 2020.

On the Ability and Limitations of Transformers to Recognize Formal Languages
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020. S. Bhattamishra, Kabir Ahuja, Navin Goyal. 23 Sep 2020.

On the Computational Power of Transformers and its Implications in Sequence Modeling
S. Bhattamishra, Arkil Patel, Navin Goyal. 16 Jun 2020.

A provably stable neural network Turing Machine
J. Stogin, A. Mali, L. Giles. 05 Jun 2020.

Recognizing Long Grammatical Sequences Using Recurrent Networks Augmented With An External Differentiable Stack
International Conference on Graphics and Interaction (GI), 2020. A. Mali, Alexander Ororbia, Daniel Kifer, C. Lee Giles. 04 Apr 2020.

Distance and Equivalence between Finite State Machines and Recurrent Neural Networks: Computational results
Reda Marzouk, C. D. L. Higuera. 01 Apr 2020.

The Neural State Pushdown Automata
A. Mali, Alexander Ororbia, C. Lee Giles. 07 Sep 2019.

Theoretical Limitations of Self-Attention in Neural Sequence Models
Transactions of the Association for Computational Linguistics (TACL), 2019. Michael Hahn. 16 Jun 2019.