ResearchTrend.AI
On Implicit Filter Level Sparsity in Convolutional Neural Networks (arXiv:1811.12495)
29 November 2018
Dushyant Mehta, K. Kim, Christian Theobalt

Papers citing "On Implicit Filter Level Sparsity in Convolutional Neural Networks"

5 papers shown
Regularization by Misclassification in ReLU Neural Networks [NoLa]
Elisabetta Cornacchia, Jan Hązła, Ido Nachum, Amir Yehudayoff
03 Nov 2021

Neural network relief: a pruning algorithm based on neural activity
Aleksandr Dekhovich, David Tax, M. Sluiter, Miguel A. Bessa
22 Sep 2021

Understanding the Effects of Pre-Training for Object Detectors via Eigenspectrum
Yosuke Shinya, E. Simo-Serra, Taiji Suzuki
09 Sep 2019

Revisiting the Importance of Individual Units in CNNs via Ablation [FAtt]
Bolei Zhou, Yiyou Sun, David Bau, Antonio Torralba
07 Jun 2018

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima [ODL]
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016