Addressing Label Shift in Distributed Learning via Entropy Regularization

4 February 2025
Zhiyuan Wu
Changkyu Choi
Xiangcheng Cao
Volkan Cevher
Ali Ramezani-Kebrya
Abstract

We address the challenge of minimizing true risk in multi-node distributed learning. These systems are frequently exposed to both inter-node and intra-node label shifts, which present a critical obstacle to effectively optimizing model performance while ensuring that data remains confined to each node. To tackle this, we propose the Versatile Robust Label Shift (VRLS) method, which enhances the maximum likelihood estimation of the test-to-train label density ratio. VRLS incorporates Shannon entropy-based regularization and adjusts the density ratio during training to better handle label shifts at test time. In multi-node learning environments, VRLS further extends its capabilities by learning and adapting density ratios across nodes, effectively mitigating label shifts and improving overall model performance. Experiments conducted on MNIST, Fashion MNIST, and CIFAR-10 show that VRLS outperforms baselines by up to 20% in imbalanced settings, highlighting the significant improvements it offers in addressing label shifts. Our theoretical analysis further supports this by establishing high-probability bounds on estimation errors.
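
The abstract describes the estimator only at a high level. The sketch below illustrates one plausible form of an entropy-regularized maximum-likelihood label-shift estimator in the spirit of VRLS: it fits the test label distribution by maximizing the marginal log-likelihood of a pretrained classifier's probabilities on unlabeled test data, with a Shannon-entropy term added to the objective. The softmax parametrization, the placement of the entropy term, and names such as estimate_label_ratio and lam are illustrative assumptions, not the authors' implementation.

import numpy as np

def estimate_label_ratio(probs_test, train_prior, lam=0.1, lr=0.5, steps=500):
    """Estimate w(y) = p_test(y) / p_train(y) from classifier outputs.

    probs_test  : (n, k) predicted class probabilities on unlabeled test data
    train_prior : (k,) empirical label distribution of the training set
    lam         : weight of the Shannon-entropy regularizer (assumed form)
    """
    k = probs_test.shape[1]
    theta = np.zeros(k)  # softmax parameters keep the estimated prior positive
    for _ in range(steps):
        q = np.exp(theta - theta.max())
        q /= q.sum()                       # candidate test label distribution
        w = q / train_prior                # implied test-to-train density ratio
        mix = probs_test @ w               # per-sample marginal likelihood
        # gradient of the average log-likelihood with respect to q
        grad_q = (probs_test / mix[:, None]).mean(axis=0) / train_prior
        # Shannon entropy H(q) = -sum_y q_y log q_y; its gradient is -(log q + 1)
        grad_q += lam * (-(np.log(q) + 1.0))
        # chain rule through the softmax parametrization
        grad_theta = q * (grad_q - q @ grad_q)
        theta += lr * grad_theta           # gradient ascent step
    q = np.exp(theta - theta.max())
    q /= q.sum()
    return q / train_prior

Given a classifier's softmax outputs on unlabeled test data and the training label frequencies, the returned vector approximates the ratio w(y) = p_test(y) / p_train(y), which could then reweight per-class losses on each node; the paper itself should be consulted for the precise objective and its multi-node extension.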

@article{wu2025_2502.02544,
  title={Addressing Label Shift in Distributed Learning via Entropy Regularization},
  author={Zhiyuan Wu and Changkyu Choi and Xiangcheng Cao and Volkan Cevher and Ali Ramezani-Kebrya},
  journal={arXiv preprint arXiv:2502.02544},
  year={2025}
}