Spike-timing-dependent Hebbian learning as noisy gradient descent

Abstract
Hebbian learning is a key principle underlying learning in biological neural networks. It postulates that synaptic changes occur locally, depending on the activities of pre- and postsynaptic neurons. While Hebbian learning based on neuronal firing rates is well explored, much less is known about learning rules that account for precise spike-timing. We relate a Hebbian spike-timing-dependent plasticity rule to noisy gradient descent with respect to a natural loss function on the probability simplex. This connection allows us to prove that the learning rule eventually identifies the presynaptic neuron with the highest activity. We also discover an intrinsic connection to noisy mirror descent.
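The convergence claim can be illustrated with a minimal numerical sketch. The exact plasticity rule is specified in the paper, not here; the toy below instead uses a multiplicative update with renormalisation onto the probability simplex, which is the exponentiated-gradient form of noisy mirror descent with the entropy mirror map. The firing rates, learning rate, and step count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical presynaptic firing rates; neuron 2 is the most active.
rates = np.array([0.2, 0.5, 0.8, 0.35])
n = len(rates)

# Synaptic weights constrained to the probability simplex.
w = np.full(n, 1.0 / n)

eta = 0.05      # learning rate (assumed, for illustration only)
steps = 5000

for _ in range(steps):
    # Noisy observation of presynaptic activity: Bernoulli spikes.
    spikes = (rng.random(n) < rates).astype(float)
    # Multiplicative Hebbian-style update, then renormalisation:
    # one step of noisy mirror descent with the entropy mirror map.
    w = w * np.exp(eta * spikes)
    w = w / w.sum()

# The weight mass concentrates on the neuron with the highest rate.
print(np.argmax(w), w.round(3))
```

In this sketch the log-weights accumulate the (noisy) spike counts, so the weight of the most active presynaptic neuron dominates exponentially fast, mirroring the identification result stated in the abstract.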
@article{dexheimer2025_2505.10272,
  title={Spike-timing-dependent Hebbian learning as noisy gradient descent},
  author={Niklas Dexheimer and Sascha Gaudlitz and Johannes Schmidt-Hieber},
  journal={arXiv preprint arXiv:2505.10272},
  year={2025}
}