Sumformer: Universal Approximation for Efficient Transformers

5 July 2023
Silas Alberti, Niclas Dern, L. Thesing, Gitta Kutyniok

Papers citing "Sumformer: Universal Approximation for Efficient Transformers"

3 of 3 citing papers shown.

On the Theoretical Expressive Power and the Design Space of Higher-Order Graph Transformers
Cai Zhou, Rose Yu, Yusu Wang
04 Apr 2024

DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training
Zhongkai Hao, Chang Su, Songming Liu, Julius Berner, Chengyang Ying, Hang Su, A. Anandkumar, Jian Song, Jun Zhu
AI4TS, AI4CE
06 Mar 2024

Big Bird: Transformers for Longer Sequences
Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed
VLM
28 Jul 2020