Studying the Plasticity in Deep Convolutional Neural Networks using Random Pruning (arXiv:1812.10240)

26 December 2018
Deepak Mittal, S. Bhardwaj, Mitesh M. Khapra, Balaraman Ravindran

Papers citing "Studying the Plasticity in Deep Convolutional Neural Networks using Random Pruning" (4 papers)
  1. Outlier Weighed Layerwise Sparsity (OWL): A Missing Secret Sauce for Pruning LLMs to High Sparsity
     Lu Yin, You Wu, Zhenyu (Allen) Zhang, Cheng-Yu Hsieh, Yaqing Wang, ..., Mykola Pechenizkiy, Yi Liang, Michael Bendersky, Zhangyang Wang, Shiwei Liu
     08 Oct 2023
  2. Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!
     Shiwei Liu, Tianlong Chen, Zhenyu (Allen) Zhang, Xuxi Chen, Tianjin Huang, Ajay Jaiswal, Zhangyang Wang
     03 Mar 2023
  3. Complexity-Driven CNN Compression for Resource-constrained Edge AI
     Muhammad Zawish, Steven Davy, L. Abraham
     26 Aug 2022
  4. Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
     Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
     10 Feb 2017