
Energy-efficient Knowledge Distillation for Spiking Neural Networks

14 June 2021
Dongjin Lee, Seongsik Park, Jongwan Kim, Wuhyeong Doh, Sungroh Yoon

Papers citing "Energy-efficient Knowledge Distillation for Spiking Neural Networks"

3 papers shown
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks
Kairong Yu, Chengting Yu, Tianqing Zhang, Xiaochen Zhao, Shu Yang, Hongwei Wang, Qiang Zhang, Qi Xu
05 Mar 2025
CADE: Cosine Annealing Differential Evolution for Spiking Neural Network
Runhua Jiang, Guodong Du, Shuyang Yu, Yifei Guo, S. Goh, Ho-Kin Tang
04 Jun 2024
LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks
Di Hong, Jiangrong Shen, Yu Qi, Yueming Wang
17 Apr 2023