Lower bounds on transformers with infinite precision
Alexander Kozachinskiy
31 December 2024

Papers citing "Lower bounds on transformers with infinite precision"

2 papers shown

Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers
Alireza Amiri, Xinting Huang, Mark Rofin, Michael Hahn
04 Feb 2025

Strassen Attention: Unlocking Compositional Abilities in Transformers Based on a New Lower Bound Method
A. Kozachinskiy, Felipe Urrutia, Hector Jimenez, Tomasz Steifer, Germán Pizarro, Matías Fuentes, Francisco Meza, Cristian Buc, Cristóbal Rojas
31 Jan 2025