ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Federated Communication-Efficient Multi-Objective Optimization

21 October 2024
Baris Askin
Pranay Sharma
Gauri Joshi
Carlee Joe-Wong
    FedML
Abstract

We study a federated version of multi-objective optimization (MOO), where a single model is trained to optimize multiple objective functions. MOO has been extensively studied in the centralized setting but is less explored in federated or distributed settings. We propose FedCMOO, a novel communication-efficient federated multi-objective optimization (FMOO) algorithm that improves the error convergence performance of the model compared to existing approaches. Unlike prior works, the communication cost of FedCMOO does not scale with the number of objectives, as each client sends a single aggregated gradient to the central server. We provide a convergence analysis of the proposed method for smooth and non-convex objective functions under milder assumptions than in prior work. In addition, we introduce a variant of FedCMOO that allows users to specify a preference over the objectives in terms of a desired ratio of the final objective values. Through extensive experiments, we demonstrate the superiority of our proposed method over baseline approaches.
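The abstract's key communication trick can be illustrated with a toy sketch: each client collapses its per-objective gradients into a single weighted vector before uploading, so the per-round communication is one gradient regardless of the number of objectives. The function names, the fixed equal weights, and the quadratic objectives below are illustrative assumptions; FedCMOO's actual objective-weighting scheme is described in the paper.

```python
import numpy as np

def client_aggregated_grad(x, objective_grads, weights):
    # Hypothetical client step: combine per-objective gradients into
    # ONE vector before upload, so communication does not scale with
    # the number of objectives (the core idea stated in the abstract).
    return sum(w * grad(x) for w, grad in zip(weights, objective_grads))

def server_round(x, clients, weights, lr=0.1):
    # The server averages the single aggregated gradient from each
    # client and takes a gradient step on the shared model.
    g = np.mean([client_aggregated_grad(x, c, weights) for c in clients], axis=0)
    return x - lr * g

# Two toy objectives per client: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
grads = [lambda x: 2 * (x - a), lambda x: 2 * (x - b)]
clients = [grads, grads]  # identical clients, for simplicity

x = np.zeros(2)
for _ in range(100):
    x = server_round(x, clients, weights=[0.5, 0.5])
# With equal weights, x converges to the weighted optimum (a + b) / 2.
```

With fixed equal weights this reduces to ordinary federated averaging on a scalarized objective; the paper's contribution lies in how the weights are chosen and in the preference-ratio variant, neither of which this sketch attempts to reproduce.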

@article{askin2025_2410.16398,
  title={Federated Communication-Efficient Multi-Objective Optimization},
  author={Baris Askin and Pranay Sharma and Gauri Joshi and Carlee Joe-Wong},
  journal={arXiv preprint arXiv:2410.16398},
  year={2025}
}