
Symmetry in Neural Network Parameter Spaces

Main: 21 pages
10 figures
2 tables
Bibliography: 8 pages
Abstract

Modern deep learning models are highly overparameterized, resulting in large sets of parameter configurations that yield the same outputs. A significant portion of this redundancy is explained by symmetries in the parameter space: transformations that leave the network function unchanged. These symmetries shape the loss landscape and constrain learning dynamics, offering a new lens for understanding optimization, generalization, and model complexity that complements existing deep learning theory. This survey provides an overview of parameter space symmetry. We summarize the existing literature, uncover connections between symmetry and learning theory, and identify gaps and opportunities in this emerging field.
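As a concrete illustration of the abstract's central notion, the sketch below (not from the paper; a minimal NumPy construction) shows two well-known parameter space symmetries of a two-layer ReLU network: permuting hidden units, and rescaling a hidden unit's incoming weights by c > 0 while dividing its outgoing weights by c. Both transformations change the parameters but leave the network function unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-layer ReLU network: f(x) = W2 @ relu(W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

def f(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

x = rng.normal(size=3)
base = f(x, W1, b1, W2, b2)

# Permutation symmetry: reorder hidden units. Permute the rows of W1 and
# entries of b1 by P, and the columns of W2 by P^T; the function is unchanged
# because relu(P z) = P relu(z) and W2 @ P.T @ P = W2.
P = np.eye(4)[[2, 0, 3, 1]]
out_perm = f(x, P @ W1, P @ b1, W2 @ P.T, b2)

# Scaling symmetry of ReLU: relu(c * z) = c * relu(z) for c > 0, so scaling
# one hidden unit's incoming weights and bias by c while scaling its outgoing
# weights by 1/c also leaves the function unchanged.
c = 2.5
D = np.diag([c, 1.0, 1.0, 1.0])
out_scale = f(x, D @ W1, D @ b1, W2 @ np.linalg.inv(D), b2)

print(np.allclose(base, out_perm))   # True
print(np.allclose(base, out_scale))  # True
```

The permuted and rescaled parameter vectors are distinct points in parameter space that realize the identical function, which is exactly the redundancy the survey studies.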

@article{zhao2025_2506.13018,
  title={Symmetry in Neural Network Parameter Spaces},
  author={Bo Zhao and Robin Walters and Rose Yu},
  journal={arXiv preprint arXiv:2506.13018},
  year={2025}
}