ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers

7 July 2023
Gamze Islamoglu, Moritz Scherer, G. Paulin, Tim Fischer, Victor J. B. Jung, Angelo Garofalo, Luca Benini
MQ

Papers citing "ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers"

2 / 2 papers shown

1. Is Space-Time Attention All You Need for Video Understanding?
   Gedas Bertasius, Heng Wang, Lorenzo Torresani
   ViT · 09 Feb 2021

2. I-BERT: Integer-only BERT Quantization
   Sehoon Kim, A. Gholami, Z. Yao, Michael W. Mahoney, Kurt Keutzer
   MQ · 05 Jan 2021