Context-Free Transductions with Neural Stacks
arXiv: 1809.02836 · 8 September 2018
Sophie Hao, William Merrill, Dana Angluin, Robert Frank, Noah Amsel, Andrew Benz, S. Mendelsohn
Papers citing "Context-Free Transductions with Neural Stacks" (8 of 8 shown)
On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning
Franz Nowak, Anej Svete, Alexandra Butoi, Ryan Cotterell
20 Jun 2024

What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages
Nadav Borenstein, Anej Svete, R. Chan, Josef Valvoda, Franz Nowak, Isabelle Augenstein, Eleanor Chodroff, Ryan Cotterell
06 Jun 2024

Recurrent Neural Language Models as Probabilistic Finite-state Automata
Anej Svete, Ryan Cotterell
08 Oct 2023

Neural Networks and the Chomsky Hierarchy
Grégoire Delétang, Anian Ruoss, Jordi Grau-Moya, Tim Genewein, L. Wenliang, ..., Chris Cundy, Marcus Hutter, Shane Legg, Joel Veness, Pedro A. Ortega
05 Jul 2022

Formal Language Theory Meets Modern NLP
William Merrill
19 Feb 2021

Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages
Mirac Suzgun, Sebastian Gehrmann, Yonatan Belinkov, Stuart M. Shieber
08 Nov 2019

Sequential Neural Networks as Automata
William Merrill
04 Jun 2019

Finding Syntactic Representations in Neural Stacks
William Merrill, Lenny Khazan, Noah Amsel, Sophie Hao, S. Mendelsohn, Robert Frank
04 Jun 2019