
Strassen Attention: Unlocking Compositional Abilities in Transformers Based on a New Lower Bound Method

31 January 2025
Alexander Kozachinskiy
Felipe Urrutia
Hector Jimenez
Tomasz Steifer
Germán Pizarro
Matías Fuentes
Francisco Meza
Cristian Buc
Cristóbal Rojas
arXiv:2501.19215

Papers citing "Strassen Attention: Unlocking Compositional Abilities in Transformers Based on a New Lower Bound Method"

1 of 1 citing papers shown:

Concise One-Layer Transformers Can Do Function Evaluation (Sometimes)
Lena Strobl
Dana Angluin
Robert Frank
28 Mar 2025