Most Activation Functions Can Win the Lottery Without Excessive Depth
R. Burkholz (MLT)
4 May 2022

Papers citing "Most Activation Functions Can Win the Lottery Without Excessive Depth"

15 of 15 citing papers shown.

  • Sign-In to the Lottery: Reparameterizing Sparse Training From Scratch. Advait Gadhikar, Tom Jacobs, Chao Zhou, R. Burkholz. 17 Apr 2025.
  • On the Surprising Effectiveness of Attention Transfer for Vision Transformers. Alexander C. Li, Yuandong Tian, B. Chen, Deepak Pathak, Xinlei Chen. 14 Nov 2024.
  • Mask in the Mirror: Implicit Sparsification. Tom Jacobs, R. Burkholz. 19 Aug 2024.
  • Cyclic Sparse Training: Is it Enough? Advait Gadhikar, Sree Harsha Nelaturu, R. Burkholz (CLL). 04 Jun 2024.
  • Spectral Graph Pruning Against Over-Squashing and Over-Smoothing. Adarsh Jamadandi, Celia Rubio-Madrigal, R. Burkholz. 06 Apr 2024.
  • A Survey of Lottery Ticket Hypothesis. Bohan Liu, Zijie Zhang, Peixiong He, Zhensen Wang, Yang Xiao, Ruimeng Ye, Yang Zhou, Wei-Shinn Ku, Bo Hui (UQCV). 07 Mar 2024.
  • Masks, Signs, And Learning Rate Rewinding. Advait Gadhikar, R. Burkholz. 29 Feb 2024.
  • Polynomially Over-Parameterized Convolutional Neural Networks Contain Structured Strong Winning Lottery Tickets. A. D. Cunha, Francesco d'Amore, Emanuele Natale (MLT). 16 Nov 2023.
  • Quantifying lottery tickets under label noise: accuracy, calibration, and complexity. V. Arora, Daniele Irto, Sebastian Goldt, G. Sanguinetti. 21 Jun 2023.
  • Workload-Balanced Pruning for Sparse Spiking Neural Networks. Ruokai Yin, Youngeun Kim, Yuhang Li, Abhishek Moitra, Nitin Satpute, Anna Hambitzer, Priyadarshini Panda. 13 Feb 2023.
  • Why Random Pruning Is All We Need to Start Sparse. Advait Gadhikar, Sohom Mukherjee, R. Burkholz. 05 Oct 2022.
  • Dynamical Isometry for Residual Networks. Advait Gadhikar, R. Burkholz (ODL, AI4CE). 05 Oct 2022.
  • A General Framework For Proving The Equivariant Strong Lottery Ticket Hypothesis. Damien Ferbach, Christos Tsirigotis, Gauthier Gidel, Avishek A. Bose. 09 Jun 2022.
  • Convolutional and Residual Networks Provably Contain Lottery Tickets. R. Burkholz (UQCV, MLT). 04 May 2022.
  • Comparing Rewinding and Fine-tuning in Neural Network Pruning. Alex Renda, Jonathan Frankle, Michael Carbin. 05 Mar 2020.