SPD-CFL: Stepwise Parameter Dropout for Efficient Continual Federated Learning
arXiv:2405.09394

15 May 2024
Yuning Yang
H. Yu
Tianrun Gao
Xiaohong Liu
Xiaodong Xu
Ping Zhang
Guangyu Wang

Papers citing "SPD-CFL: Stepwise Parameter Dropout for Efficient Continual Federated Learning"

3 / 3 papers shown
FedRPCA: Enhancing Federated LoRA Aggregation Using Robust PCA
Divyansh Jhunjhunwala
Arian Raje
Madan Ravi Ganesh
Chaithanya Kumar Mummadi
Chaoqun Dong
Jiawei Zhou
Wan-Yi Lin
Gauri Joshi
Zhenzhen Li
01 Jun 2025
A Survey on Federated Fine-tuning of Large Language Models
Yebo Wu
Chunlin Tian
Jingguang Li
He Sun
Kahou Tam
Zhanting Zhou
Haicheng Liao
Zhijiang Guo
Li Li
Chengzhong Xu
15 Mar 2025
DEeR: Deviation Eliminating and Noise Regulating for Privacy-preserving Federated Low-rank Adaptation
IEEE Transactions on Medical Imaging (IEEE TMI), 2024
Meilu Zhu
Axiu Mao
Jun Liu
Yixuan Yuan
16 Oct 2024