ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers
arXiv:2307.03493 · 7 July 2023
Gamze Islamoglu, Moritz Scherer, G. Paulin, Tim Fischer, Victor J. B. Jung, Angelo Garofalo, Luca Benini
MQ
Papers citing "ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers" (2 of 2 papers shown)
Is Space-Time Attention All You Need for Video Understanding?
Gedas Bertasius, Heng Wang, Lorenzo Torresani
ViT · 09 Feb 2021

I-BERT: Integer-only BERT Quantization
Sehoon Kim, A. Gholami, Z. Yao, Michael W. Mahoney, Kurt Keutzer
MQ · 05 Jan 2021