Energy-efficient Knowledge Distillation for Spiking Neural Networks
arXiv: 2106.07172 · 14 June 2021
Dongjin Lee, Seongsik Park, Jongwan Kim, Wuhyeong Doh, Sungroh Yoon
Papers citing "Energy-efficient Knowledge Distillation for Spiking Neural Networks" (3 / 3 papers shown)
1. Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks
   Kairong Yu, Chengting Yu, Tianqing Zhang, Xiaochen Zhao, Shu Yang, Hongwei Wang, Qiang Zhang, Qi Xu
   05 Mar 2025
2. CADE: Cosine Annealing Differential Evolution for Spiking Neural Network
   Runhua Jiang, Guodong Du, Shuyang Yu, Yifei Guo, S. Goh, Ho-Kin Tang
   04 Jun 2024
3. LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks
   Di Hong, Jiangrong Shen, Yu Qi, Yueming Wang
   17 Apr 2023