To update or not to update? Neurons at equilibrium in deep models

Andrea Bragagnolo, Enzo Tartaglione, Marco Grangetto
19 July 2022 · arXiv: 2207.09455

Papers citing "To update or not to update? Neurons at equilibrium in deep models"

6 papers shown.

SCoTTi: Save Computation at Training Time with an adaptive framework
Ziyu Li, Enzo Tartaglione, Van-Tam Nguyen
19 Dec 2023 · 0 citations

Towards On-device Learning on the Edge: Ways to Select Neurons to Update under a Budget Constraint
Ael Quélennec, Enzo Tartaglione, Pavlo Mozharovskyi, Van-Tam Nguyen
08 Dec 2023 · 2 citations

Can we avoid Double Descent in Deep Neural Networks?
Victor Quétu, Enzo Tartaglione
26 Feb 2023 · 3 citations · AI4CE

SeReNe: Sensitivity based Regularization of Neurons for Structured Sparsity in Neural Networks
Enzo Tartaglione, Andrea Bragagnolo, Francesco Odierna, A. Fiandrotti, Marco Grangetto
07 Feb 2021 · 18 citations

SCOP: Scientific Control for Reliable Neural Network Pruning
Yehui Tang, Yunhe Wang, Yixing Xu, Dacheng Tao, Chunjing Xu, Chao Xu, Chang Xu
21 Oct 2020 · 166 citations · AAML

What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020 · 1,027 citations