ResearchTrend.AI

Is attention required for ICL? Exploring the Relationship Between Model Architecture and In-Context Learning Ability

International Conference on Learning Representations (ICLR), 2023
12 October 2023
Ivan Lee
Nan Jiang
Taylor Berg-Kirkpatrick
arXiv (abs) · PDF · HTML · GitHub

Papers citing "Is attention required for ICL? Exploring the Relationship Between Model Architecture and In-Context Learning Ability"

4 papers shown
Weight-Space Linear Recurrent Neural Networks
Roussel Desmond Nzoyem, Nawid Keshtmand, Enrique Crespo Fernandez, Idriss Tsayem, Raúl Santos-Rodríguez, David A.W. Barton, Tom Deakin
01 Jun 2025
Training Dynamics of In-Context Learning in Linear Attention
Yedi Zhang, Aaditya K. Singh, Peter E. Latham, Andrew Saxe
27 Jan 2025
Fine-grained Analysis of In-context Linear Estimation: Data, Architecture, and Beyond
Yingcong Li, A. S. Rawat, Samet Oymak
13 Jul 2024
Is Mamba Capable of In-Context Learning?
Riccardo Grazzi, Julien N. Siems, Simon Schrodi, Thomas Brox, Frank Hutter
05 Feb 2024