FFT-MoE: Efficient Federated Fine-Tuning for Foundation Models via Large-scale Sparse MoE under Heterogeneous Edge

26 August 2025
Gang Hu
Yinglei Teng
Pengfei Wu
Nan Wang
    MoE
arXiv: 2508.18663 (abs / PDF / HTML)

Papers citing "FFT-MoE: Efficient Federated Fine-Tuning for Foundation Models via Large-scale Sparse MoE under Heterogeneous Edge"

OvA-LP: A Simple and Efficient Framework for Federated Learning on Non-IID Data
Dongjin Park
Hasung Yeo
Joon-Woo Lee
FedML
07 Nov 2025