Counting Like Transformers: Compiling Temporal Counting Logic Into Softmax Transformers
Andy Yang, David Chiang
arXiv:2404.04393, 5 April 2024
Papers citing "Counting Like Transformers: Compiling Temporal Counting Logic Into Softmax Transformers" (8 papers):
1. Between Circuits and Chomsky: Pre-pretraining on Formal Languages Imparts Linguistic Biases. Michael Y. Hu, Jackson Petty, Chuan Shi, William Merrill, Tal Linzen. 26 Feb 2025.
2. Ehrenfeucht-Haussler Rank and Chain of Thought. Pablo Barceló, A. Kozachinskiy, Tomasz Steifer. 22 Jan 2025.
3. Transformers in Uniform TC⁰. David Chiang. 20 Sep 2024.
4. Language Models Need Inductive Biases to Count Inductively. Yingshan Chang, Yonatan Bisk. 30 May 2024.
5. The Expressive Capacity of State Space Models: A Formal Language Perspective. Yash Sarrof, Yana Veitsman, Michael Hahn. 27 May 2024.
6. Masked Hard-Attention Transformers Recognize Exactly the Star-Free Languages. Andy Yang, David Chiang, Dana Angluin. 21 Oct 2023.
7. On the Expressivity Role of LayerNorm in Transformers' Attention. Shaked Brody, Shiyu Jin, Xinghao Zhu. 04 May 2023.
8. A Logic for Expressing Log-Precision Transformers. William Merrill, Ashish Sabharwal. 06 Oct 2022.