ResearchTrend.AI
The Expressibility of Polynomial based Attention Scheme


30 October 2023
Zhao-quan Song
Guangyi Xu
Junze Yin
arXiv:2310.20051

Papers citing "The Expressibility of Polynomial based Attention Scheme"

5 / 5 papers shown
1. Fast Gradient Computation for RoPE Attention in Almost Linear Time
   Yifang Chen, Jiayan Huo, Xiaoyu Li, Yingyu Liang, Zhenmei Shi, Zhao-quan Song
   03 Jan 2025

2. LLMs as Potential Brainstorming Partners for Math and Science Problems
   Sophia Gu
   10 Oct 2023

3. Who's Harry Potter? Approximate Unlearning in LLMs
   Ronen Eldan, M. Russinovich
   03 Oct 2023

4. Do Transformers Parse while Predicting the Masked Word?
   Haoyu Zhao, A. Panigrahi, Rong Ge, Sanjeev Arora
   14 Mar 2023

5. Dynamic Tensor Product Regression
   Aravind Reddy, Zhao-quan Song, Licheng Zhang
   08 Oct 2022