Universal Approximation Under Constraints is Possible with Transformers
Anastasis Kratsios, Behnoosh Zamanlooy, Tianlin Liu, Ivan Dokmanić
arXiv:2110.03303 · 7 October 2021

Papers citing "Universal Approximation Under Constraints is Possible with Transformers"

5 / 5 papers shown
1. Approximation Rate of the Transformer Architecture for Sequence Modeling
   Hao Jiang, Qianxiao Li (03 Jan 2025)
2. TL-PCA: Transfer Learning of Principal Component Analysis
   Sharon Hendy, Yehuda Dar (14 Oct 2024)
3. Are Transformers with One Layer Self-Attention Using Low-Rank Weight Matrices Universal Approximators?
   T. Kajitsuka, Issei Sato (26 Jul 2023)
4. Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges
   M. Bronstein, Joan Bruna, Taco S. Cohen, Petar Veličković (GNN · 27 Apr 2021)
5. Geometric deep learning: going beyond Euclidean data
   M. Bronstein, Joan Bruna, Yann LeCun, Arthur Szlam, P. Vandergheynst (GNN · 24 Nov 2016)