Progressive Skeletonization: Trimming more fat from a network at initialization
arXiv: 2006.09081, 16 June 2020
Pau de Jorge, Amartya Sanyal, Harkirat Singh Behl, Philip H. S. Torr, Grégory Rogez, P. Dokania
Papers citing "Progressive Skeletonization: Trimming more fat from a network at initialization" (showing 10 of 60):

Signing the Supermask: Keep, Hide, Invert. Nils Koster, O. Grothe, Achim Rettinger. 31 Jan 2022.
When to Prune? A Policy towards Early Structural Pruning. Maying Shen, Pavlo Molchanov, Hongxu Yin, J. Álvarez. 22 Oct 2021.
HALP: Hardware-Aware Latency Pruning. Maying Shen, Hongxu Yin, Pavlo Molchanov, Lei Mao, Jianna Liu, J. Álvarez. 20 Oct 2021.
Connectivity Matters: Neural Network Pruning Through the Lens of Effective Sparsity. Artem Vysogorets, Julia Kempe. 05 Jul 2021.
Sparse Training via Boosting Pruning Plasticity with Neuroregeneration. Shiwei Liu, Tianlong Chen, Xiaohan Chen, Zahra Atashgahi, Lu Yin, Huanyu Kou, Li Shen, Mykola Pechenizkiy, Zhangyang Wang, D. Mocanu. 19 Jun 2021.
Dense for the Price of Sparse: Improved Performance of Sparsely Initialized Networks via a Subspace Offset. Ilan Price, Jared Tanner. 12 Feb 2021.
Pruning Neural Networks at Initialization: Why are We Missing the Mark? Jonathan Frankle, Gintare Karolina Dziugaite, Daniel M. Roy, Michael Carbin. 18 Sep 2020.
Pruning neural networks without any data by iteratively conserving synaptic flow. Hidenori Tanaka, D. Kunin, Daniel L. K. Yamins, Surya Ganguli. 09 Jun 2020.
A Brain-inspired Algorithm for Training Highly Sparse Neural Networks. Zahra Atashgahi, Joost Pieterse, Shiwei Liu, D. Mocanu, Raymond N. J. Veldhuis, Mykola Pechenizkiy. 17 Mar 2019.
Neural Architecture Search with Reinforcement Learning. Barret Zoph, Quoc V. Le. 05 Nov 2016.