ResearchTrend.AI
Doping: A technique for efficient compression of LSTM models using sparse structured additive matrices

Conference on Machine Learning and Systems (MLSys), 2021
14 February 2021
Urmish Thakker, P. Whatmough, Zhi-Gang Liu, Matthew Mattina, Jesse G. Beu
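As the title indicates, doping compresses a weight matrix by representing it as a structured matrix (e.g., a Kronecker product of small factors) plus a sparse additive correction. A minimal NumPy sketch of this decomposition follows; the shapes, sparsity level, and variable names are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Structured component: Kronecker product of two small factors.
# kron(A, B) is (256, 256) but stores only 2 * 16 * 16 parameters.
A = rng.standard_normal((16, 16))
B = rng.standard_normal((16, 16))

# Sparse additive "dopant": keep roughly 1% of entries nonzero.
S = rng.standard_normal((256, 256))
mask = rng.random((256, 256)) < 0.01
S *= mask

# Doped weight matrix: structured part plus sparse correction.
W = np.kron(A, B) + S

dense_params = W.size
doped_params = A.size + B.size + int(mask.sum())
print(f"compression ratio: {dense_params / doped_params:.1f}x")
```

The sparse term recovers expressiveness that the structured factorization alone loses, while the parameter count stays far below that of the dense matrix.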

Papers citing "Doping: A technique for efficient compression of LSTM models using sparse structured additive matrices"

4 papers shown:

• Fast Kronecker Matrix-Matrix Multiplication on GPUs
  Abhinav Jangda, Mohit Yadav
  18 Jan 2024

• Sparse-IFT: Sparse Iso-FLOP Transformations for Maximizing Training Efficiency
  International Conference on Machine Learning (ICML), 2023
  Vithursan Thangarasa, Shreyas Saxena, Abhay Gupta, Sean Lie
  21 Mar 2023

• Machine Learning for Microcontroller-Class Hardware: A Review
  IEEE Sensors Journal (IEEE Sens. J.), 2022
  Swapnil Sayan Saha, S. Sandha, Mani B. Srivastava
  29 May 2022

• S2TA: Exploiting Structured Sparsity for Energy-Efficient Mobile CNN Acceleration
  International Symposium on High-Performance Computer Architecture (HPCA), 2021
  Zhi-Gang Liu, P. Whatmough, Yuhao Zhu, Matthew Mattina
  16 Jul 2021