QKFormer: Hierarchical Spiking Transformer using Q-K Attention (arXiv:2403.16552)
25 March 2024
Chenlin Zhou
Han Zhang
Zhaokun Zhou
Liutao Yu
Liwei Huang
Xiaopeng Fan
Liuliang Yuan
Zhengyu Ma
Huihui Zhou
Yonghong Tian
Papers citing "QKFormer: Hierarchical Spiking Transformer using Q-K Attention" (5 papers)

Optimal ANN-SNN Conversion for High-accuracy and Ultra-low-latency Spiking Neural Networks
Tong Bu, Wei Fang, Jianhao Ding, Penglin Dai, Zhaofei Yu, Tiejun Huang
08 Mar 2023

Spikformer: When Spiking Neural Network Meets Transformer
Zhaokun Zhou, Yuesheng Zhu, Chao He, Yaowei Wang, Shuicheng Yan, Yonghong Tian, Liuliang Yuan
29 Sep 2022

Masked Autoencoders Are Scalable Vision Learners
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
ViT, TPM
11 Nov 2021

Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State
Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Yisen Wang, Zhouchen Lin
29 Sep 2021

Deep Residual Learning in Spiking Neural Networks
Wei Fang, Zhaofei Yu, Yanqing Chen, Tiejun Huang, T. Masquelier, Yonghong Tian
08 Feb 2021