Adversarial Attacks on Event-Based Pedestrian Detectors: A Physical Approach
Event cameras, known for their low latency and high dynamic range, show great potential in pedestrian detection applications. However, while recent research has primarily focused on improving detection accuracy, the robustness of event-based visual models against physical adversarial attacks has received limited attention. For example, adversarial physical objects, such as specific clothing patterns or accessories, can exploit inherent vulnerabilities in these systems, leading to misdetections or misclassifications. This study is the first to explore physical adversarial attacks on event-based pedestrian detectors, specifically investigating whether certain clothing patterns worn by pedestrians can cause these detectors to fail, effectively rendering the wearer undetectable. To this end, we develop an end-to-end adversarial framework in the digital domain, framing the design of adversarial clothing textures as a 2D texture optimization problem. By crafting an effective adversarial loss function, the framework iteratively generates optimal textures through backpropagation. Our results demonstrate that the textures identified in the digital domain possess strong adversarial properties. Furthermore, we translate these digitally optimized textures into physical clothing and test them in real-world scenarios, confirming that the designed textures significantly degrade the performance of event-based pedestrian detection models. This work highlights the vulnerability of such models to physical adversarial attacks.
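To make the optimization idea concrete, below is a minimal PyTorch sketch of the general technique described in the abstract: treating the clothing pattern as a trainable 2D texture and updating it by backpropagating an adversarial loss that suppresses the detector's pedestrian confidence. Everything here is an illustrative assumption rather than the authors' actual pipeline: DummyEventDetector stands in for a real event-based detector, apply_texture stands in for rendering the texture onto the pedestrian, and the loss is a simple confidence-suppression term.

import torch
import torch.nn as nn

class DummyEventDetector(nn.Module):
    """Placeholder for an event-based pedestrian detector (assumption)."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1),
        )
    def forward(self, x):
        # Pedestrian confidence in [0, 1]
        return torch.sigmoid(self.backbone(x))

def apply_texture(event_frame, texture, mask):
    """Placeholder for rendering the clothing texture into the event frame."""
    return event_frame * (1 - mask) + texture * mask

detector = DummyEventDetector().eval()
event_frame = torch.rand(1, 1, 128, 128)   # surrogate event-count frame (assumption)
mask = torch.zeros_like(event_frame)
mask[..., 40:100, 50:80] = 1.0             # hypothetical clothing region

# Trainable 2D texture, optimized through backpropagation
texture = torch.rand(1, 1, 128, 128, requires_grad=True)
optimizer = torch.optim.Adam([texture], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    adv_frame = apply_texture(event_frame, texture.clamp(0, 1), mask)
    confidence = detector(adv_frame)
    loss = confidence.mean()   # adversarial loss: drive detection confidence down
    loss.backward()
    optimizer.step()

In the paper's setting, the rendering step and detector would be replaced by a differentiable event-simulation and the target event-based detection model, but the optimization loop has the same structure: forward pass, adversarial loss, gradient step on the texture.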
@article{lin2025_2503.00377,
  title={Adversarial Attacks on Event-Based Pedestrian Detectors: A Physical Approach},
  author={Guixu Lin and Muyao Niu and Qingtian Zhu and Zhengwei Yin and Zhuoxiao Li and Shengfeng He and Yinqiang Zheng},
  journal={arXiv preprint arXiv:2503.00377},
  year={2025}
}