ResearchTrend.AI
Rethinking Spiking Neural Networks from an Ensemble Learning Perspective

20 February 2025
Yongqi Ding
Lin Zuo
Mengmeng Jing
Pei He
Hanpu Deng
ArXiv · PDF · HTML
Abstract

Spiking neural networks (SNNs) exhibit superior energy efficiency but suffer from limited performance. In this paper, we consider SNNs as ensembles of temporal subnetworks that share architectures and weights, and highlight a crucial issue that affects their performance: excessive differences in initial states (neuronal membrane potentials) across timesteps lead to unstable subnetwork outputs, resulting in degraded performance. To mitigate this, we promote the consistency of the initial membrane potential distribution and of the outputs through membrane potential smoothing and temporally adjacent subnetwork guidance, respectively, improving overall stability and performance. Moreover, membrane potential smoothing facilitates the forward propagation of information and the backward propagation of gradients, mitigating the notorious temporal gradient vanishing problem. Our method requires only minimal modification of the spiking neurons without adapting the network structure, making it generalizable; it shows consistent performance gains in 1D speech, 2D object, and 3D point cloud recognition tasks. In particular, on the challenging CIFAR10-DVS dataset, we achieved 83.20% accuracy with only four timesteps. This provides valuable insights into unleashing the potential of SNNs.
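The abstract frames each timestep as a subnetwork whose output is destabilized by divergent initial membrane potentials. The sketch below illustrates that idea on a leaky integrate-and-fire (LIF) layer: instead of each timestep starting from an arbitrary residual state, the initial potential of the next timestep's subnetwork is scaled toward a common baseline. The function name `lif_with_smoothing`, the smoothing factor `alpha`, and the specific blending rule are all illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def lif_with_smoothing(inputs, tau=2.0, v_th=1.0, alpha=0.5):
    """Toy LIF layer over T timesteps with a hypothetical membrane
    potential smoothing step. `inputs` has shape (T, N): T timesteps
    (one 'subnetwork' each), N neurons. `alpha` in [0, 1] controls how
    strongly each subnetwork's initial state is pulled toward zero,
    reducing the spread of initial potentials across timesteps."""
    T, N = inputs.shape
    v_init = np.zeros(N)              # initial membrane potential
    spikes = np.zeros((T, N))
    for t in range(T):
        # Leaky integration for this timestep's subnetwork
        v = v_init / tau + inputs[t]
        spikes[t] = (v >= v_th).astype(float)
        v_after = v * (1.0 - spikes[t])   # hard reset on spike
        # Smoothing (assumed form): damp the carried-over state so
        # successive subnetworks start from more similar potentials
        v_init = alpha * v_after
    return spikes
```

With `alpha = 0`, every timestep starts from an identical (zero) state; with `alpha = 1`, the full residual potential is carried over, maximizing the divergence the paper identifies as harmful.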

View on arXiv
@article{ding2025_2502.14218,
  title={Rethinking Spiking Neural Networks from an Ensemble Learning Perspective},
  author={Yongqi Ding and Lin Zuo and Mengmeng Jing and Pei He and Hanpu Deng},
  journal={arXiv preprint arXiv:2502.14218},
  year={2025}
}