Logical Languages Accepted by Transformer Encoders with Hard Attention

5 October 2023 · arXiv:2310.03817
Pablo Barceló, A. Kozachinskiy, A. W. Lin, Vladimir Podolskii

Papers citing "Logical Languages Accepted by Transformer Encoders with Hard Attention"

3 / 3 papers shown
1. Rosetta-PL: Propositional Logic as a Benchmark for Large Language Model Reasoning
   Shaun Baek, Shaun Esua-Mensah, Cyrus Tsui, Sejan Vigneswaralingam, Abdullah Alali, Michael Lu, Vasu Sharma, Sean O'Brien, Kevin Zhu
   LRM · 25 Mar 2025
2. Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers
   Alireza Amiri, Xinting Huang, Mark Rofin, Michael Hahn
   LRM · 04 Feb 2025
3. Counting Like Transformers: Compiling Temporal Counting Logic Into Softmax Transformers
   Andy Yang, David Chiang
   05 Apr 2024