The Empirical Impact of Reducing Symmetries on the Performance of Deep Ensembles and MoE

24 February 2025
Andrei Chernov
Oleg Novitskij
Abstract

Recent studies have shown that reducing symmetries in neural networks enhances linear mode connectivity between networks without requiring parameter space alignment, leading to improved performance in linearly interpolated neural networks. However, in practical applications, neural network interpolation is rarely used; instead, ensembles of networks are more common. In this paper, we empirically investigate the impact of reducing symmetries on the performance of deep ensembles and Mixture of Experts (MoE) across five datasets. Additionally, to explore deeper linear mode connectivity, we introduce the Mixture of Interpolated Experts (MoIE). Our results show that deep ensembles built on asymmetric neural networks achieve significantly better performance as ensemble size increases compared to their symmetric counterparts. In contrast, our experiments do not provide conclusive evidence on whether reducing symmetries affects the performance of MoE and MoIE architectures.
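The contrast drawn in the abstract is between interpolating trained networks in weight space (which relies on linear mode connectivity) and ensembling them in output space. The following is a minimal sketch of that distinction, assuming PyTorch classifiers; the function names and the simple output-averaging and weight-averaging schemes are illustrative and do not reflect the authors' actual implementation of the asymmetric networks or of MoIE.

# Illustrative sketch (not the authors' code): contrasts a deep ensemble,
# which averages member predictions, with linear weight interpolation,
# which depends on linear mode connectivity between the trained networks.
import copy
import torch

def ensemble_predict(models, x):
    """Deep ensemble: average the softmax outputs of independently trained members."""
    probs = [torch.softmax(m(x), dim=-1) for m in models]
    return torch.stack(probs).mean(dim=0)

def interpolate_weights(model_a, model_b, alpha=0.5):
    """Linearly interpolate parameters of two networks: (1 - alpha) * A + alpha * B.
    With reduced symmetries, the interpolated network is expected to remain
    in a low-loss region without explicit parameter-space alignment."""
    merged = copy.deepcopy(model_a)
    state_a, state_b = model_a.state_dict(), model_b.state_dict()
    merged.load_state_dict(
        {k: (1 - alpha) * state_a[k] + alpha * state_b[k] for k in state_a}
    )
    return merged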

View on arXiv
@article{chernov2025_2502.17391,
  title={The Empirical Impact of Reducing Symmetries on the Performance of Deep Ensembles and MoE},
  author={Andrei Chernov and Oleg Novitskij},
  journal={arXiv preprint arXiv:2502.17391},
  year={2025}
}