Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments

Main: 9 pages · 9 figures · Bibliography: 10 pages · 15 tables · Appendix: 24 pages
Abstract

Federated Learning (FL) is a decentralized machine learning paradigm that enables clients to collaboratively train models while preserving data privacy. However, the coexistence of model and data heterogeneity gives rise to inconsistent representations and divergent optimization dynamics across clients, ultimately hindering robust global performance. To address these challenges, we propose Mosaic, a novel data-free knowledge distillation framework tailored for heterogeneous distributed environments. Mosaic first trains local generative models to approximate each client's personalized distribution, enabling synthetic data generation that safeguards privacy through strict separation from real data. Subsequently, Mosaic forms a Mixture-of-Experts (MoE) from client models based on their specialized knowledge, and distills it into a global model using the generated data. To further enhance the MoE architecture, Mosaic integrates expert predictions via a lightweight meta model trained on a few representative prototypes. Extensive experiments on standard image classification benchmarks demonstrate that Mosaic consistently outperforms state-of-the-art approaches under both model and data heterogeneity. The source code has been published at this https URL.
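To make the described pipeline concrete, below is a minimal PyTorch sketch of the server-side distillation step: sample synthetic data from the client generators, combine the client experts into an MoE teacher via a lightweight gate, and match the student (global model) to the ensemble. All names here (MetaGate, distill_step, and the assumption that every generator shares one noise dimension) are hypothetical illustrations, not the paper's released implementation; in particular, the paper trains its meta model on a few representative prototypes, whereas this sketch simply feeds the gate flattened pixels for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaGate(nn.Module):
    # Hypothetical lightweight meta model: maps a feature vector to
    # softmax weights over the experts (one weight per expert, per sample).
    def __init__(self, feat_dim: int, num_experts: int):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_experts)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return F.softmax(self.fc(feats), dim=-1)  # (batch, num_experts)

def distill_step(generators, experts, gate, student, optimizer,
                 batch_size=64, z_dim=100, temperature=2.0):
    # One data-free distillation step. Assumption: all generators accept
    # noise of the same dimension z_dim; no real client data is touched.
    device = next(student.parameters()).device

    # 1) Synthesize a batch from each client's generator.
    z = torch.randn(batch_size, z_dim, device=device)
    fake = torch.cat([g(z) for g in generators], dim=0)

    # 2) MoE teacher: gate weights combine per-expert logits sample-wise.
    with torch.no_grad():
        expert_logits = torch.stack([e(fake) for e in experts], dim=1)  # (N, E, C)
        weights = gate(fake.flatten(1))                                 # (N, E)
        teacher_logits = (weights.unsqueeze(-1) * expert_logits).sum(dim=1)

    # 3) Distill: KL divergence between softened teacher and student outputs.
    student_logits = student(fake)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    optimizer.zero_grad()
    loss.backward()  # only the student receives gradients here
    optimizer.step()
    return loss.item()

For CIFAR-sized inputs one would construct the gate as, e.g., gate = MetaGate(3 * 32 * 32, len(experts)); freezing the gate and experts inside torch.no_grad() ensures that only the global student model is updated during distillation.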

@article{liu2025_2505.19699,
  title={Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments},
  author={Junming Liu and Yanting Gao and Siyuan Meng and Yifei Sun and Aoqi Wu and Yufei Jin and Yirong Chen and Ding Wang and Guosun Zeng},
  journal={arXiv preprint arXiv:2505.19699},
  year={2025}
}