ResearchTrend.AI
Holistic Filter Pruning for Efficient Deep Neural Networks

arXiv:2009.08169 · 17 September 2020
Lukas Enderich, Fabian Timm, Wolfram Burgard

Papers citing "Holistic Filter Pruning for Efficient Deep Neural Networks" (5 of 5 shown)

HESSO: Towards Automatic Efficient and User Friendly Any Neural Network Training and Pruning (11 Sep 2024)
Tianyi Chen, Xiaoyi Qu, David Aponte, Colby R. Banbury, Jongwoo Ko, Tianyu Ding, Yong Ma, Vladimir Lyapunov, Ilya Zharkov, Luming Liang

Deep Neural Networks pruning via the Structured Perspective Regularization (28 Jun 2022)
M. Cacciola, A. Frangioni, Xinlin Li, Andrea Lodi

Neural Network Pruning Through Constrained Reinforcement Learning (16 Oct 2021)
Shehryar Malik, Muhammad Umair Haider, O. Iqbal, M. Taj

Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression (19 Mar 2020)
Yawei Li, Shuhang Gu, Christoph Mayer, Luc Van Gool, Radu Timofte

Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights (10 Feb 2017)
Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen