In-context Learning with Transformer Is Really Equivalent to a Contrastive Learning Pattern
arXiv:2310.13220 · 20 October 2023
Ruifeng Ren, Yong Liu

Papers citing "In-context Learning with Transformer Is Really Equivalent to a Contrastive Learning Pattern"

Theoretical Insights into Fine-Tuning Attention Mechanism: Generalization and Optimization
International Joint Conference on Artificial Intelligence (IJCAI), 2024
Xinhao Yao, Hongjin Qian, Xiaolin Hu, Gengze Xu, Wei Liu, Jian Luan, Bin Wang, Wenshu Fan
03 Oct 2024