Structured Initialization for Attention in Vision Transformers

1 April 2024
Jianqiao Zheng, Xueqian Li, Simon Lucey
    ViT

Papers citing "Structured Initialization for Attention in Vision Transformers"

On the Relationship between Self-Attention and Convolutional Layers
Jean-Baptiste Cordonnier, Andreas Loukas, Martin Jaggi
International Conference on Learning Representations (ICLR), 2020
08 Nov 2019