Transformers are Universal In-context Learners

Takashi Furuya, Maarten V. de Hoop, Gabriel Peyré
2 August 2024 · arXiv:2408.01367

Papers citing "Transformers are Universal In-context Learners" (6 of 6 papers shown)

DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows
  Jonathan Geuter, Clément Bonet, Anna Korba, David Alvarez-Melis
  56 · 0 · 0 | 03 Mar 2025

Towards Understanding the Universality of Transformers for Next-Token Prediction
  Michael E. Sander, Gabriel Peyré
  Topics: CML
  29 · 0 · 0 | 03 Oct 2024

How do Transformers perform In-Context Autoregressive Learning?
  Michael E. Sander, Raja Giryes, Taiji Suzuki, Mathieu Blondel, Gabriel Peyré
  16 · 7 · 0 | 08 Feb 2024

Small Transformers Compute Universal Metric Embeddings
  Anastasis Kratsios, Valentin Debarnot, Ivan Dokmanić
  52 · 11 · 0 | 14 Sep 2022

Your Transformer May Not be as Powerful as You Expect
  Shengjie Luo, Shanda Li, Shuxin Zheng, Tie-Yan Liu, Liwei Wang, Di He
  52 · 50 · 0 | 26 May 2022

PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation
  C. Qi, Hao Su, Kaichun Mo, Leonidas J. Guibas
  Topics: 3DH, 3DPC, 3DV, PINN
  219 · 13,886 · 0 | 02 Dec 2016