FedHyper: A Universal and Robust Learning Rate Scheduler for Federated Learning with Hypergradient Descent

4 October 2023
Ziyao Wang, Jianyu Wang, Ang Li
    FedML
ArXiv · PDF · HTML

Papers citing "FedHyper: A Universal and Robust Learning Rate Scheduler for Federated Learning with Hypergradient Descent"

1 / 1 papers shown
FLoRA: Federated Fine-Tuning Large Language Models with Heterogeneous Low-Rank Adaptations
Ziyao Wang, Zheyu Shen, Yexiao He, Guoheng Sun, Hongyi Wang, Lingjuan Lyu, Ang Li
09 Sep 2024