ResearchTrend.AI
A Huber Loss Minimization Approach to Byzantine Robust Federated Learning

24 August 2023
Puning Zhao, Fei Yu, Zhiguo Wan
FedML
Abstract

Federated learning systems are susceptible to adversarial attacks. To combat this, we introduce a novel aggregator based on Huber loss minimization and provide a comprehensive theoretical analysis. Under the independent and identically distributed (i.i.d.) assumption, our approach has several advantages over existing methods. Firstly, it has optimal dependence on ϵ, the fraction of attacked clients. Secondly, it does not require precise knowledge of ϵ. Thirdly, it allows different clients to have unequal data sizes. We then broaden our analysis to the non-i.i.d. setting, in which clients have slightly different distributions.
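To illustrate the general idea of a Huber-loss-based aggregator (this is a sketch of the technique, not the paper's exact algorithm), one can find the point minimizing the sum of Huber losses of the distances to the client updates via iteratively reweighted averaging, a Weiszfeld-style scheme. The threshold `delta` and the update shapes below are illustrative assumptions.

```python
import numpy as np

def huber_weight(r, delta):
    # Ratio psi(r)/r for the Huber loss: 1 in the quadratic region
    # (r <= delta), delta/r in the linear region (r > delta).
    return np.where(r <= delta, 1.0, delta / np.maximum(r, 1e-12))

def huber_aggregate(updates, delta=1.0, iters=100, tol=1e-8):
    """Aggregate client updates by minimizing the sum of Huber losses
    of the residual norms ||x_i - z|| over z, via iteratively
    reweighted averaging. Illustrative sketch, not the paper's
    exact aggregator."""
    x = np.asarray(updates, dtype=float)   # shape: (n_clients, dim)
    z = x.mean(axis=0)                     # initialize at the mean
    for _ in range(iters):
        r = np.linalg.norm(x - z, axis=1)  # per-client residual norms
        w = huber_weight(r, delta)         # outliers get weight delta/r
        z_new = (w[:, None] * x).sum(axis=0) / w.sum()
        if np.linalg.norm(z_new - z) < tol:
            return z_new
        z = z_new
    return z
```

Because each client's pull on the aggregate is capped at `delta` in the linear region, a minority of Byzantine clients sending arbitrarily large updates can only shift the result by a bounded amount, unlike the plain mean.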
