Optimising Communication Overhead in Federated Learning Using NSGA-II

Published: 1 April 2022
Authors: José Á. Morell, Z. Dahi, Francisco Chicano, Gabriel Luque, Enrique Alba
Community: FedML
arXiv: 2204.02183 (PDF and HTML available)

Papers citing "Optimising Communication Overhead in Federated Learning Using NSGA-II"

Showing 4 of 4 citing papers:

How I Warped Your Noise: a Temporally-Correlated Noise Prior for Diffusion Models
Pascal Chang, Jingwei Tang, Markus Gross, Vinicius Azevedo
DiffM · 03 Apr 2025

Optimizing Privacy, Utility and Efficiency in Constrained Multi-Objective Federated Learning
Yan Kang, Hanlin Gu, Xingxing Tang, Yuanqin He, Yuzhu Zhang, Jinnan He, Yuxing Han, Lixin Fan, Kai Chen, Qiang Yang
FedML · 29 Apr 2023

Probably Approximately Correct Federated Learning
Xiaojin Zhang, Anbu Huang, Lixin Fan, Kai Chen, Qiang Yang
FedML · 10 Apr 2023

GTFLAT: Game Theory Based Add-On For Empowering Federated Learning Aggregation Techniques
Hamidreza Mahini, H. Mousavi, Masoud Daneshtalab
FedML · 08 Dec 2022