REDS: Resource-Efficient Deep Subnetworks for Dynamic Resource Constraints

22 November 2023
Francesco Corti, Balz Maag, Joachim Schauer, U. Pferschy, O. Saukh

Papers citing "REDS: Resource-Efficient Deep Subnetworks for Dynamic Resource Constraints"

4 / 4 papers shown
• Forget the Data and Fine-Tuning! Just Fold the Network to Compress
  Dong Wang, Haris Šikić, Lothar Thiele, O. Saukh
  17 Feb 2025

• Git Re-Basin: Merging Models modulo Permutation Symmetries
  Samuel K. Ainsworth, J. Hayase, S. Srinivasa
  11 Sep 2022 · MoMe

• Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
  Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
  31 Jan 2021 · MQ

• TensorFlow Lite Micro: Embedded Machine Learning on TinyML Systems
  R. David, Jared Duke, Advait Jain, Vijay Janapa Reddi, Nat Jeffries, ..., Meghna Natraj, Shlomi Regev, Rocky Rhodes, Tiezhen Wang, Pete Warden
  17 Oct 2020