Federated Dynamical Low-Rank Training with Global Loss Convergence Guarantees

25 June 2024
Steffen Schotthöfer, M. P. Laiu
FedML
arXiv:2406.17887

Papers citing "Federated Dynamical Low-Rank Training with Global Loss Convergence Guarantees"

5 / 5 papers shown
Low-rank lottery tickets: finding efficient low-rank neural networks via matrix differential equations
Steffen Schotthöfer, Emanuele Zangrando, J. Kusch, Gianluca Ceruti, Francesco Tudisco
26 May 2022

Initialization and Regularization of Factorized Neural Layers
M. Khodak, Neil A. Tenenholtz, Lester W. Mackey, Nicolò Fusi
03 May 2021

Communication-Efficient Federated Learning with Dual-Side Low-Rank Compression
Zhefeng Qiao, Xianghao Yu, Jun Zhang, Khaled B. Letaief
FedML
26 Apr 2021

Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients
A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani
FedML
14 Feb 2021

FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani
FedML
28 Sep 2019