
Dynamic Manifold Hopfield Networks for Context-Dependent Associative Memory

Comments: 13 pages main text, 4 figures, 2 tables; 3-page bibliography
Abstract

Neural population activity in cortical and hippocampal circuits can be flexibly reorganized by context, suggesting that cognition relies on dynamic manifolds rather than static representations. However, how such dynamic organization can be realized mechanistically within a unified dynamical system remains unclear. Continuous Hopfield networks provide a classical attractor framework in which neural dynamics follow gradient descent on a fixed energy landscape, constraining retrieval to a static attractor manifold geometry. Extending this approach, we introduce Dynamic Manifold Hopfield Networks (DMHN), continuous dynamical models in which contextual modulation dynamically reshapes attractor geometry, transforming a static attractor manifold into a context-dependent family of neural manifolds. In DMHN, network interactions are learned in a data-driven manner to intrinsically deform the geometry of the attractor manifold across cues, without explicit context-specific parameterization. As a result, in associative retrieval, DMHN achieve substantially higher capacity and robustness than classical and modern Hopfield networks: when storing 2N patterns in a network of N neurons, DMHN attain reliable retrieval with an average accuracy of 64%, compared with 1% and 13% for the classical and modern variants, respectively. Together, these results establish dynamic reorganization of attractor manifold geometry as a principled mechanism for context-dependent remapping in neural associative memory.
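For readers unfamiliar with the baseline the abstract contrasts against, the following is a minimal illustrative sketch of a classical Hopfield network with Hebbian storage, whose retrieval dynamics descend a fixed energy landscape. This is not the paper's DMHN model; the network size `N`, pattern count `P`, and noise level are arbitrary example choices:

```python
import numpy as np

# Illustrative classical Hopfield baseline (NOT the paper's DMHN model).
# Binary (+/-1) patterns are stored via the Hebbian rule and retrieved by
# asynchronous sign updates, which descend the fixed energy
# E(x) = -1/2 x^T W x.

rng = np.random.default_rng(0)
N = 100                                  # number of neurons (example value)
P = 5                                    # stored patterns, well below ~0.14N capacity
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights: W = (1/N) * sum_mu xi_mu xi_mu^T, with zero self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def retrieve(x, steps=20):
    """Asynchronous updates until (approximate) convergence to an attractor."""
    x = x.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

# Corrupt a stored pattern by flipping 10 bits, then retrieve it
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
recalled = retrieve(probe)
overlap = (recalled @ patterns[0]) / N   # 1.0 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

At this low loading the corrupted probe falls back into the stored pattern's basin of attraction; the abstract's 2N-pattern regime is far beyond this classical capacity, which is where the reported accuracy gap (64% vs. 1%) arises.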
