Logical Languages Accepted by Transformer Encoders with Hard Attention
arXiv: 2310.03817
5 October 2023
Pablo Barceló, A. Kozachinskiy, A. W. Lin, Vladimir Podolskii
Papers citing "Logical Languages Accepted by Transformer Encoders with Hard Attention" (3 papers)
1. Rosetta-PL: Propositional Logic as a Benchmark for Large Language Model Reasoning
   Shaun Baek, Shaun Esua-Mensah, Cyrus Tsui, Sejan Vigneswaralingam, Abdullah Alali, Michael Lu, Vasu Sharma, Sean O'Brien, Kevin Zhu
   25 Mar 2025

2. Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers
   Alireza Amiri, Xinting Huang, Mark Rofin, Michael Hahn
   04 Feb 2025

3. Counting Like Transformers: Compiling Temporal Counting Logic Into Softmax Transformers
   Andy Yang, David Chiang
   05 Apr 2024