Every Parameter Matters: Ensuring the Convergence of Federated Learning with Dynamic Heterogeneous Models Reduction

12 October 2023 · arXiv:2310.08670
Hanhan Zhou, Tian-Shing Lan, Guru Venkataramani, Wenbo Ding

Papers citing "Every Parameter Matters: Ensuring the Convergence of Federated Learning with Dynamic Heterogeneous Models Reduction"

4 papers shown

MAC-PO: Multi-Agent Experience Replay via Collective Priority Optimization
Yongsheng Mei, Hanhan Zhou, Tian-Shing Lan, Guru Venkataramani, Peng Wei
21 Feb 2023

FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
Samuel Horváth, Stefanos Laskaridis, Mario Almeida, Ilias Leondiadis, Stylianos I. Venieris, Nicholas D. Lane
26 Feb 2021

What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020

Adaptive Federated Learning in Resource Constrained Edge Computing Systems
Shiqiang Wang, Tiffany Tuor, Theodoros Salonidis, K. Leung, C. Makaya, T. He, Kevin S. Chan
14 Apr 2018