Accelerator-Aware Pruning for Convolutional Neural Networks

26 April 2018
Hyeong-Ju Kang

Papers citing "Accelerator-Aware Pruning for Convolutional Neural Networks" (5 papers shown)
Effective Interplay between Sparsity and Quantization: From Theory to Practice
Simla Burcu Harma, Ayan Chakraborty, Elizaveta Kostenok, Danila Mishin, Dongho Ha, ..., Martin Jaggi, Ming Liu, Yunho Oh, Suvinay Subramanian, Amir Yazdanbakhsh
31 May 2024 · MQ
Accelerator-Aware Training for Transducer-Based Speech Recognition
Suhaila M. Shakiah, R. Swaminathan, Hieu Duy Nguyen, Raviteja Chinta, Tariq Afzal, Nathan Susanj, Athanasios Mouchtaris, Grant P. Strimel, Ariya Rastrow
12 May 2023
DNNShield: Dynamic Randomized Model Sparsification, A Defense Against Adversarial Machine Learning
Mohammad Hossein Samavatian, Saikat Majumdar, Kristin Barber, R. Teodorescu
31 Jul 2022 · AAML
Hardware Acceleration of Sparse and Irregular Tensor Computations of ML Models: A Survey and Insights
Shail Dave, Riyadh Baghdadi, Tony Nowatzki, Sasikanth Avancha, Aviral Shrivastava, Baoxin Li
02 Jul 2020
SASL: Saliency-Adaptive Sparsity Learning for Neural Network Acceleration
Jun Shi, Jianfeng Xu, K. Tasaka, Zhibo Chen
12 Mar 2020