ResearchTrend.AI

Extending Sparse Tensor Accelerators to Support Multiple Compression Formats

18 March 2021
Eric Qin
Geonhwa Jeong
William Won
Sheng-Chun Kao
Hyoukjun Kwon
S. Srinivasan
Dipankar Das
G. Moon
S. Rajamanickam
T. Krishna

Papers citing "Extending Sparse Tensor Accelerators to Support Multiple Compression Formats"

4 papers shown
FLAASH: Flexible Accelerator Architecture for Sparse High-Order Tensor Contraction
Gabriel Kulp
Andrew Ensinger
Lizhong Chen
25 Apr 2024
Progressive Gradient Flow for Robust N:M Sparsity Training in Transformers
A. Bambhaniya
Amir Yazdanbakhsh
Suvinay Subramanian
Sheng-Chun Kao
Shivani Agrawal
Utku Evci
Tushar Krishna
07 Feb 2024
Training Recipe for N:M Structured Sparsity with Decaying Pruning Mask
Sheng-Chun Kao
Amir Yazdanbakhsh
Suvinay Subramanian
Shivani Agrawal
Utku Evci
T. Krishna
15 Sep 2022
A Systematic Survey of General Sparse Matrix-Matrix Multiplication
Jianhua Gao
Weixing Ji
Fangli Chang
Zhaonian Tan
Bingxin Wei
Zeming Liu
Yueyan Zhao
26 Feb 2020