Self-Distillation Learning Based on Temporal-Spatial Consistency for Spiking Neural Networks
arXiv: 2406.07862 · 12 June 2024
Lin Zuo, Yongqi Ding, Mengmeng Jing, Kunshan Yang, Yunqian Yu
Papers citing "Self-Distillation Learning Based on Temporal-Spatial Consistency for Spiking Neural Networks" (6 papers shown):

NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao (23 May 2023)

Joint A-SNN: Joint Training of Artificial and Spiking Neural Networks via Self-Distillation and Weight Factorization
Yu-Zhu Guo, Weihang Peng, Y. Chen, Liwen Zhang, Xiaode Liu, Xuhui Huang, Zhe Ma (03 May 2023)

Spikformer: When Spiking Neural Network Meets Transformer
Zhaokun Zhou, Yuesheng Zhu, Chao He, Yaowei Wang, Shuicheng Yan, Yonghong Tian, Liuliang Yuan (29 Sep 2022)

Temporal Efficient Training of Spiking Neural Network via Gradient Re-weighting
Shi-Wee Deng, Yuhang Li, Shanghang Zhang, Shi Gu (24 Feb 2022)

AutoSNN: Towards Energy-Efficient Spiking Neural Networks
Byunggook Na, J. Mok, Seongsik Park, Dongjin Lee, Hyeokjun Choe, Sungroh Yoon (30 Jan 2022)

Deep Residual Learning in Spiking Neural Networks
Wei Fang, Zhaofei Yu, Yanqing Chen, Tiejun Huang, T. Masquelier, Yonghong Tian (08 Feb 2021)