
Quantitative Attractor Analysis of High-Capacity Kernel Hopfield Networks

Main: 15 pages
7 figures
Bibliography: 1 page
Abstract

Kernel-based learning methods such as Kernel Logistic Regression (KLR) can substantially increase the storage capacity of Hopfield networks, but the principles governing their performance and stability remain largely uncharacterized. This paper presents a comprehensive quantitative analysis of the attractor landscape in KLR-trained networks to establish a solid foundation for their design and application. Through extensive, statistically validated simulations, we address critical questions of generality, scalability, and robustness. Our comparative analysis shows that KLR and Kernel Ridge Regression (KRR) exhibit similarly high storage capacities and clean attractor landscapes under typical operating conditions, suggesting that this behavior is a general property of kernel regression methods, although KRR is computationally much faster. We identify a non-trivial, scale-dependent law for the kernel width γ, demonstrating that optimal capacity requires γ to be scaled such that γN increases with network size N. This finding implies that larger networks require more localized kernels, in which each pattern's influence is more spatially confined, to mitigate inter-pattern interference. Under this optimized scaling, we provide clear evidence that storage capacity scales linearly with network size (P ∝ N). Furthermore, our sensitivity analysis shows that performance is remarkably robust with respect to the choice of the regularization parameter λ. Collectively, these findings provide a concise set of empirical principles for designing high-capacity and robust associative memories and clarify the mechanisms that enable kernel methods to overcome the classical limitations of Hopfield-type models.
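
For readers unfamiliar with the setup, the following is a minimal sketch of the kind of kernel-regression associative memory the abstract refers to, using the KRR variant with an RBF kernel of width γ and ridge parameter λ. The function names, parameter values, and update schedule here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """RBF (Gaussian) kernel between the rows of X and Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def train_krr(patterns, gamma, lam):
    """Fit a KRR readout that maps each stored pattern back onto itself.

    Returns dual coefficients alpha of shape (P, N): one column per neuron.
    """
    P, _ = patterns.shape
    K = rbf_kernel(patterns, patterns, gamma)                # (P, P) Gram matrix
    alpha = np.linalg.solve(K + lam * np.eye(P), patterns)   # ridge solution in the dual
    return alpha

def recall(state, patterns, alpha, gamma, max_steps=50):
    """Iterate the kernel readout as a recurrent update until a fixed point (attractor)."""
    s = state.astype(float)
    for _ in range(max_steps):
        k = rbf_kernel(s[None, :], patterns, gamma)          # similarities to the stored patterns
        s_new = np.sign(k @ alpha).ravel()
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Toy usage: store P random +/-1 patterns with P well above the classical ~0.14 N limit,
# corrupt one pattern, and check that the dynamics fall back into its attractor.
rng = np.random.default_rng(0)
N, P = 100, 40
patterns = rng.choice([-1.0, 1.0], size=(P, N))
gamma, lam = 1.0 / N, 1e-3        # illustrative values only; the paper tunes gamma with N
alpha = train_krr(patterns, gamma, lam)

probe = patterns[0].copy()
probe[rng.choice(N, size=10, replace=False)] *= -1    # flip 10% of the bits
recovered = recall(probe, patterns, alpha, gamma)
print("fraction of bits recovered:", np.mean(recovered == patterns[0]))
```

In this sketch the γN scaling discussed in the abstract would correspond to choosing γ so that γN grows with N, making each stored pattern's basin of influence narrower as the network gets larger.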
