SeReNe: Sensitivity based Regularization of Neurons for Structured Sparsity in Neural Networks
arXiv: 2102.03773 · 7 February 2021
Authors: Enzo Tartaglione, Andrea Bragagnolo, Francesco Odierna, A. Fiandrotti, Marco Grangetto

Papers citing "SeReNe: Sensitivity based Regularization of Neurons for Structured Sparsity in Neural Networks"

5 / 5 papers shown

Playing the Lottery With Concave Regularizers for Sparse Trainable Neural Networks
  Giulia Fracastoro, Sophie M. Fosson, Andrea Migliorati, G. Calafiore · 19 Jan 2025

Compressing Explicit Voxel Grid Representations: fast NeRFs become also small
  C. Deng, Enzo Tartaglione · 23 Oct 2022

Towards Efficient Capsule Networks
  Riccardo Renzulli, Marco Grangetto · 19 Aug 2022

The rise of the lottery heroes: why zero-shot pruning is hard
  Enzo Tartaglione · 24 Feb 2022

Channel Pruning via Automatic Structure Search
  Mingbao Lin, Rongrong Ji, Yu-xin Zhang, Baochang Zhang, Yongjian Wu, Yonghong Tian · 23 Jan 2020