LowFormer: Hardware Efficient Design for Convolutional Transformer Backbones

5 September 2024
Moritz Nottebaum, Matteo Dunnhofer, C. Micheloni
Topic: ViT

Papers citing "LowFormer: Hardware Efficient Design for Convolutional Transformer Backbones" (3 papers)
| Title | Authors | Topic | Metrics | Date |
| --- | --- | --- | --- | --- |
| iFormer: Integrating ConvNet and Transformer for Mobile Application | Chuanyang Zheng | ViT | 67 / 0 / 0 | 26 Jan 2025 |
| SHViT: Single-Head Vision Transformer with Memory Efficient Macro Design | Seokju Yun, Youngmin Ro | ViT | 26 / 29 / 0 | 29 Jan 2024 |
| Hydra Attention: Efficient Attention with Many Heads | Daniel Bolya, Cheng-Yang Fu, Xiaoliang Dai, Peizhao Zhang, Judy Hoffman | | 99 / 75 / 0 | 15 Sep 2022 |