Exploiting Kernel Sparsity and Entropy for Interpretable CNN Compression
Yuchao Li, Shaohui Lin, Baochang Zhang, Jianzhuang Liu, David Doermann, Yongjian Wu, Feiyue Huang, R. Ji
arXiv:1812.04368 · 11 December 2018

Papers citing "Exploiting Kernel Sparsity and Entropy for Interpretable CNN Compression"

12 papers shown

Surrogate Lagrangian Relaxation: A Path To Retrain-free Deep Neural Network Pruning
Shangli Zhou, Mikhail A. Bragin, Lynn Pepin, Deniz Gurevin, Fei Miao, Caiwen Ding
08 Apr 2023

Sauron U-Net: Simple automated redundancy elimination in medical image segmentation via filter pruning
Juan Miguel Valverde, Artem Shatillo, Jussi Tohka
AAML · 27 Sep 2022

Efficient CNN with uncorrelated Bag of Features pooling
Firas Laakom, Jenni Raitoharju, Alexandros Iosifidis, M. Gabbouj
22 Sep 2022

Survey: Exploiting Data Redundancy for Optimization of Deep Learning
Jou-An Chen, Wei Niu, Bin Ren, Yanzhi Wang, Xipeng Shen
29 Aug 2022

Revisiting Random Channel Pruning for Neural Network Compression
Yawei Li, Kamil Adamczewski, Wen Li, Shuhang Gu, Radu Timofte, Luc Van Gool
11 May 2022

Class-Discriminative CNN Compression
Yuchen Liu, D. Wentzlaff, S. Kung
21 Oct 2021

Neural Network Compression Via Sparse Optimization
Tianyi Chen, Bo Ji, Yixin Shi, Tianyu Ding, Biyi Fang, Sheng Yi, Xiao Tu
10 Nov 2020

Self-grouping Convolutional Neural Networks
Qingbei Guo, Xiaojun Wu, J. Kittler, Zhiquan Feng
29 Sep 2020

Dynamic Group Convolution for Accelerating Convolutional Neural Networks
Z. Su, Linpu Fang, Wenxiong Kang, D. Hu, M. Pietikäinen, Li Liu
08 Jul 2020

Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression
Yawei Li, Shuhang Gu, Christoph Mayer, Luc Van Gool, Radu Timofte
19 Mar 2020

Learning Filter Basis for Convolutional Neural Network Compression
Yawei Li, Shuhang Gu, Luc Van Gool, Radu Timofte
SupR · 23 Aug 2019

Revisiting the Importance of Individual Units in CNNs via Ablation
Bolei Zhou, Yiyou Sun, David Bau, Antonio Torralba
FAtt · 07 Jun 2018