ResearchTrend.AI
ADMM-Based Training for Spiking Neural Networks

8 May 2025
Giovanni Perin
Cesare Bidini
Riccardo Mazzieri
Michele Rossi
Abstract

In recent years, spiking neural networks (SNNs) have gained momentum due to their high potential for time-series processing combined with minimal energy consumption. However, they still lack a dedicated and efficient training algorithm. The popular backpropagation with surrogate gradients, adapted from stochastic gradient descent (SGD)-derived algorithms, has several drawbacks when used as an optimizer for SNNs; specifically, it suffers from low scalability and numerical imprecision. In this paper, we propose a novel SNN training method based on the alternating direction method of multipliers (ADMM). Our ADMM-based training aims to solve the problem of the non-differentiability of the SNN step function. We formulate the problem, derive closed-form updates, and, in a simulated proof-of-concept, empirically demonstrate the optimizer's convergence properties and potential, outlining possible research directions to further improve the method.
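To illustrate the general idea behind the approach the abstract describes, the sketch below shows the standard ADMM scaffold applied to a toy problem with a non-differentiable binary constraint (loosely analogous to a spiking step function). This is a generic illustration under assumed notation, not the paper's actual formulation: the objective, splitting, and variable names here are hypothetical, chosen only to show how ADMM alternates a closed-form smooth update with a projection.

```python
import numpy as np

# Toy ADMM splitting (illustrative only; NOT the paper's formulation):
#   minimize 0.5 * ||A x - b||^2   subject to  x = z,  z in {0, 1}^n
# The binary constraint stands in for a non-differentiable step function.
# ADMM alternates a smooth least-squares update (closed form), a
# projection onto the binary set, and a scaled dual-variable update.

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((20, n))
x_true = (rng.random(n) > 0.5).astype(float)  # binary ground truth
b = A @ x_true

rho = 1.0
x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)  # scaled dual variable
M = np.linalg.inv(A.T @ A + rho * np.eye(n))  # precomputed for x-update

for _ in range(50):
    x = M @ (A.T @ b + rho * (z - u))      # closed-form smooth subproblem
    z = np.clip(np.round(x + u), 0.0, 1.0)  # projection onto {0, 1}
    u = u + x - z                           # dual ascent step

print(z)
```

Because the projection step is nonconvex, convergence guarantees are weaker than in the convex ADMM setting; the paper's contribution lies precisely in deriving closed-form updates suited to the SNN step function, which this sketch does not reproduce.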

@article{perin2025_2505.05527,
  title={ADMM-Based Training for Spiking Neural Networks},
  author={Giovanni Perin and Cesare Bidini and Riccardo Mazzieri and Michele Rossi},
  journal={arXiv preprint arXiv:2505.05527},
  year={2025}
}