arXiv:1812.08119
Cited By
Adam Induces Implicit Weight Sparsity in Rectifier Neural Networks
19 December 2018
A. Yaguchi, Taiji Suzuki, Wataru Asano, Shuhei Nitta, Y. Sakata, A. Tanizawa
Papers citing "Adam Induces Implicit Weight Sparsity in Rectifier Neural Networks" (7 of 7 papers shown)
Spiking Synaptic Penalty: Appropriate Penalty Term for Energy-Efficient Spiking Neural Networks
Kazuma Suetake, Takuya Ushimaru, Ryuji Saiin, Yoshihide Sawada
03 Feb 2023
The Impact of Activation Sparsity on Overfitting in Convolutional Neural Networks
Karim Huesmann, Luis Garcia Rodriguez, Lars Linsen, Benjamin Risse
13 Apr 2021
Schizophrenia-mimicking layers outperform conventional neural network layers
Frontiers in Neurorobotics (FN), 2020
R. Mizutani, Senta Noguchi, R. Saiga, Yuichi Yamashita, M. Miyashita, Makoto Arai, M. Itokawa
23 Sep 2020
Exploiting the Full Capacity of Deep Neural Networks while Avoiding Overfitting by Targeted Sparsity Regularization
Karim Huesmann, Soeren Klemm, Lars Linsen, Benjamin Risse
21 Feb 2020
How Does BN Increase Collapsed Neural Network Filters?
Sheng Zhou, Xinjiang Wang, Ping Luo, Xue Jiang, Wenjie Li, Wei Zhang
30 Jan 2020
Understanding the Effects of Pre-Training for Object Detectors via Eigenspectrum
Yosuke Shinya, E. Simo-Serra, Taiji Suzuki
09 Sep 2019
On Implicit Filter Level Sparsity in Convolutional Neural Networks
Dushyant Mehta, K. Kim, Christian Theobalt
29 Nov 2018