BAT: Learning Event-based Optical Flow with Bidirectional Adaptive Temporal Correlation

Abstract

Event cameras deliver visual information with high dynamic range and high temporal resolution, offering significant advantages for estimating optical flow under complex lighting conditions and for fast-moving objects. Current advanced event-based optical flow methods largely adopt established image-based frameworks; however, the spatial sparsity of event data limits their performance. In this paper, we present BAT, an innovative framework that estimates event-based optical flow using bidirectional adaptive temporal correlation. BAT comprises three novel designs: 1) a bidirectional temporal correlation that transforms bidirectional temporally dense motion cues into spatially dense ones, enabling accurate and spatially dense optical flow estimation; 2) an adaptive temporal sampling strategy that maintains temporal consistency in the correlation; and 3) a spatially adaptive temporal motion aggregation that efficiently and adaptively fuses consistent target motion features into adjacent motion features while suppressing inconsistent ones. Our method ranks 1st on the DSEC-Flow benchmark, outperforming existing state-of-the-art methods by a large margin while also producing sharp edges and high-quality details. Notably, BAT can accurately predict future optical flow using only past events, significantly outperforming E-RAFT's warm-start approach. Code: this https URL.
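To make the core idea concrete, the following is a minimal, hypothetical sketch of a bidirectional temporal correlation: a reference feature map is correlated with feature maps from past and future event slices, and the per-timestep correlations are aggregated with softmax weights so that temporally consistent motion cues dominate. The function names, the dot-product correlation, and the softmax aggregation are illustrative assumptions only, not the paper's exact formulation.

```python
import numpy as np

def correlation(ref, feat):
    # Per-pixel dot-product correlation between two (C, H, W) feature maps.
    return (ref * feat).sum(axis=0)  # (H, W)

def bidirectional_temporal_correlation(ref, past_feats, future_feats):
    # Illustrative sketch (not the paper's exact method): correlate the
    # reference features with past and future event-slice features, then
    # aggregate over time with softmax weights so consistent motion cues
    # are emphasized and inconsistent ones suppressed.
    corrs = np.stack([correlation(ref, f) for f in past_feats + future_feats])  # (T, H, W)
    weights = np.exp(corrs) / np.exp(corrs).sum(axis=0, keepdims=True)          # softmax over T
    return (weights * corrs).sum(axis=0)                                        # (H, W)

# Toy example with random features.
rng = np.random.default_rng(0)
C, H, W = 8, 4, 4
ref = rng.standard_normal((C, H, W))
past = [rng.standard_normal((C, H, W)) for _ in range(2)]
future = [rng.standard_normal((C, H, W)) for _ in range(2)]
out = bidirectional_temporal_correlation(ref, past, future)
print(out.shape)
```

The softmax weighting here stands in for the paper's spatially adaptive aggregation: timesteps whose correlation agrees with the target motion receive larger weights at each pixel.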

@article{xu2025_2503.03256,
  title={BAT: Learning Event-based Optical Flow with Bidirectional Adaptive Temporal Correlation},
  author={Gangwei Xu and Haotong Lin and Zhaoxing Zhang and Hongcheng Luo and Haiyang Sun and Xin Yang},
  journal={arXiv preprint arXiv:2503.03256},
  year={2025}
}