arXiv:2403.16552
QKFormer: Hierarchical Spiking Transformer using Q-K Attention
25 March 2024
Chenlin Zhou, Han Zhang, Zhaokun Zhou, Liutao Yu, Liwei Huang, Xiaopeng Fan, Liuliang Yuan, Zhengyu Ma, Huihui Zhou, Yonghong Tian
Papers citing "QKFormer: Hierarchical Spiking Transformer using Q-K Attention" (5 papers)
Optimal ANN-SNN Conversion for High-accuracy and Ultra-low-latency Spiking Neural Networks
Tong Bu, Wei Fang, Jianhao Ding, Penglin Dai, Zhaofei Yu, Tiejun Huang
08 Mar 2023
Spikformer: When Spiking Neural Network Meets Transformer
Zhaokun Zhou, Yuesheng Zhu, Chao He, Yaowei Wang, Shuicheng Yan, Yonghong Tian, Liuliang Yuan
29 Sep 2022
Masked Autoencoders Are Scalable Vision Learners
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
11 Nov 2021
Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State
Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Yisen Wang, Zhouchen Lin
29 Sep 2021
Deep Residual Learning in Spiking Neural Networks
Wei Fang, Zhaofei Yu, Yanqing Chen, Tiejun Huang, T. Masquelier, Yonghong Tian
08 Feb 2021