SpikePool: Event-driven Spiking Transformer with Pooling Attention

14 October 2025
Donghyun Lee
Alex Sima
Yuhang Li
Panos Stinis
Priyadarshini Panda
arXiv:2510.12102
Main: 7 pages · Appendix: 3 pages · Bibliography: 2 pages · 9 figures · 6 tables
Abstract

Building on the success of transformers, Spiking Neural Networks (SNNs) have increasingly been integrated with transformer architectures, leading to spiking transformers that demonstrate promising performance on event-based vision tasks. However, despite these empirical successes, there remains limited understanding of how spiking transformers fundamentally process event-based data. Current approaches primarily focus on architectural modifications without analyzing the underlying signal processing characteristics. In this work, we analyze spiking transformers through the frequency spectrum domain and discover that they behave as high-pass filters, contrasting with Vision Transformers (ViTs) that act as low-pass filters. This frequency domain analysis reveals why certain designs work well for event-based data, which contains valuable high-frequency information but is also sparse and noisy. Based on this observation, we propose SpikePool, which replaces spike-based self-attention with max pooling attention, a low-pass filtering operation, to create a selective band-pass filtering effect. This design preserves meaningful high-frequency content while capturing critical features and suppressing noise, achieving a better balance for event-based data processing. Our approach demonstrates competitive results on event-based datasets for both classification and object detection tasks while significantly reducing training and inference time by up to 42.5% and 32.8%, respectively.
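
To make the core idea concrete, below is a minimal PyTorch sketch of what a pooling attention block of the kind the abstract describes might look like: spike-based self-attention is replaced by a max pooling operation over the spatial token grid (a low-pass filter), which, combined with the high-pass behavior of the spiking layers, yields the band-pass effect the authors describe. The module name MaxPoolAttention, the kernel size, and the 1x1 projection are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class MaxPoolAttention(nn.Module):
        """Hypothetical sketch of pooling attention: a max pool over the
        spatial grid stands in for spike-based QKV self-attention.
        Names and hyperparameters are assumptions, not the paper's code."""

        def __init__(self, dim: int, kernel_size: int = 3):
            super().__init__()
            # Same-size max pooling: keeps the strongest local response,
            # acting as a low-pass / denoising operation on sparse spikes.
            self.pool = nn.MaxPool2d(kernel_size, stride=1,
                                     padding=kernel_size // 2)
            # 1x1 projection to mix channels, analogous to attention's
            # output projection (an assumption for illustration).
            self.proj = nn.Conv2d(dim, dim, kernel_size=1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, dim, height, width), binary spike maps in {0, 1}.
            # Pooling suppresses isolated noisy events while preserving
            # dominant features, with no QK^T softmax to compute.
            return self.proj(self.pool(x))

    # Minimal usage on a random sparse "spike" tensor.
    x = (torch.rand(2, 64, 32, 32) > 0.9).float()
    attn = MaxPoolAttention(dim=64)
    print(attn(x).shape)  # torch.Size([2, 64, 32, 32])

Because the pooling path involves no pairwise token interactions, a block like this is cheaper than spike-based self-attention, which is consistent with the reported training- and inference-time reductions.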
