
Emergence of Structure in Ensembles of Random Neural Networks

Abstract

Randomness is ubiquitous across data science and machine learning. Remarkably, systems composed of random components often display emergent global behaviors that appear deterministic, manifesting a transition from microscopic disorder to macroscopic organization. In this work, we introduce a theoretical model for studying the emergence of collective behaviors in ensembles of random classifiers. We argue that, if the ensemble is weighted through the Gibbs measure obtained by adopting the classification loss as an energy, then there exists a finite temperature at which the ensemble classification is optimal with respect to the loss (equivalently, the energy). Interestingly, for the case in which samples are generated by a Gaussian distribution and labels are constructed by a teacher perceptron, we analytically prove and numerically confirm that this optimal temperature depends neither on the teacher classifier (which is, by construction of the learning problem, unknown) nor on the number of random classifiers, highlighting the universal nature of the observed behavior. Experiments on the MNIST dataset underline the relevance of this phenomenon in high-quality, noiseless datasets. Finally, a physical analogy allows us to shed light on the self-organizing nature of the studied phenomenon.
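
To make the setup concrete, below is a minimal sketch (not the authors' code) of the construction the abstract describes: Gaussian samples labeled by a teacher perceptron, an ensemble of random linear classifiers, and Gibbs weighting of the ensemble by the classification loss at a temperature T, which can then be swept to look for a finite optimum. The variable names and the choice of 0-1 loss and weighted majority vote are illustrative assumptions, not details taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

d, n, K = 20, 500, 200           # input dimension, samples, random classifiers
X = rng.standard_normal((n, d))  # Gaussian samples
teacher = rng.standard_normal(d) # teacher perceptron (unknown to the learner)
y = np.sign(X @ teacher)         # labels produced by the teacher

W = rng.standard_normal((K, d))  # ensemble of random linear classifiers
preds = np.sign(X @ W.T)         # (n, K): prediction of each classifier
losses = (preds != y[:, None]).mean(axis=0)  # 0-1 loss per classifier ("energy")

def gibbs_ensemble_error(T):
    """Aggregate predictions with Gibbs weights exp(-loss / T)."""
    w = np.exp(-losses / T)
    w /= w.sum()                 # normalized Gibbs measure over classifiers
    agg = np.sign(preds @ w)     # weighted majority vote of the ensemble
    return (agg != y).mean()     # classification error of the aggregate

# Sweep the temperature: the paper argues a finite T minimizes the error.
for T in [0.001, 0.01, 0.05, 0.1, 0.5, 1.0, 10.0]:
    print(f"T = {T:6.3f}  ensemble error = {gibbs_ensemble_error(T):.3f}")

In this toy version, T -> 0 concentrates all weight on the single best random classifier, while large T reduces the aggregate to an unweighted average; the paper's claim concerns an optimal temperature strictly between these extremes, independent of the teacher and of K.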

@article{muscarnera2025_2505.10331,
  title={Emergence of Structure in Ensembles of Random Neural Networks},
  author={Luca Muscarnera and Luigi Loreti and Giovanni Todeschini and Alessio Fumagalli and Francesco Regazzoni},
  journal={arXiv preprint arXiv:2505.10331},
  year={2025}
}