Class-dependent Compression of Deep Neural Networks
R. Entezari, O. Saukh
23 September 2019

Papers citing "Class-dependent Compression of Deep Neural Networks" (4 of 4 shown)
Forget the Data and Fine-Tuning! Just Fold the Network to Compress
Dong Wang, Haris Šikić, Lothar Thiele, O. Saukh
17 Feb 2025
REDS: Resource-Efficient Deep Subnetworks for Dynamic Resource Constraints
Francesco Corti, Balz Maag, Joachim Schauer, U. Pferschy, O. Saukh
22 Nov 2023
Studying the impact of magnitude pruning on contrastive learning methods
Francesco Corti, R. Entezari, Sara Hooker, D. Bacciu, O. Saukh
01 Jul 2022
Understanding the effect of sparsity on neural networks robustness
Lukas Timpl, R. Entezari, Hanie Sedghi, Behnam Neyshabur, O. Saukh
22 Jun 2022