DEIO: Deep Event Inertial Odometry

Event cameras show great potential for visual odometry (VO) in handling challenging situations, such as fast motion and high dynamic range. Despite this promise, the sparse and motion-dependent characteristics of event data continue to limit the performance of feature-based and direct data-association methods in practical applications. To address these limitations, we propose Deep Event Inertial Odometry (DEIO), the first monocular learning-based event-inertial framework, which combines a learning-based front-end with traditional nonlinear graph-based optimization. Specifically, an event-based recurrent network is adopted to provide accurate and sparse associations of event patches over time. DEIO further integrates these event-patch associations with the IMU to recover up-to-scale pose and provide robust state estimation. The Hessian information derived from the learned differentiable bundle adjustment (DBA) is utilized to optimize the co-visibility factor graph, which tightly incorporates event-patch correspondences and IMU pre-integration within a keyframe-based sliding window. Comprehensive validations demonstrate that DEIO achieves superior performance on 10 challenging public benchmarks compared with more than 20 state-of-the-art methods.
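To make the tight coupling concrete, the sketch below illustrates, in simplified NumPy form, how a Hessian/gradient block produced by a learned DBA layer could be fused with IMU pre-integration factors in a single Gauss-Newton step over the sliding-window states. The function name fuse_dba_and_imu_step, the state layout, and the random toy factor blocks are illustrative assumptions for exposition, not the paper's actual implementation.

import numpy as np

def fuse_dba_and_imu_step(H_dba, b_dba, H_imu, b_imu, damping=1e-4):
    """One Gauss-Newton step on the stacked sliding-window state.

    H_dba, b_dba : hypothetical Hessian and gradient blocks contributed by the
                   learned differentiable bundle adjustment (event-patch
                   reprojection factors), reduced to the keyframe states.
    H_imu, b_imu : hypothetical Hessian and gradient blocks from IMU
                   pre-integration factors between consecutive keyframes.
    Returns the state update dx that decreases the joint cost.
    """
    H = H_dba + H_imu                      # tightly-coupled information matrix
    H += damping * np.eye(H.shape[0])      # Levenberg-Marquardt-style damping
    b = b_dba + b_imu
    return np.linalg.solve(H, -b)

if __name__ == "__main__":
    # Toy example: 4 keyframes x 6-DoF pose states (velocity and bias blocks
    # omitted for brevity), with random positive-definite factor blocks.
    rng = np.random.default_rng(0)
    n = 4 * 6
    A = rng.standard_normal((n, n)); H_dba = A @ A.T
    B = rng.standard_normal((n, n)); H_imu = B @ B.T
    b_dba = rng.standard_normal(n)
    b_imu = rng.standard_normal(n)
    dx = fuse_dba_and_imu_step(H_dba, b_dba, H_imu, b_imu)
    print("state update norm:", np.linalg.norm(dx))

In this simplified view, summing the two information blocks before solving is what makes the visual (event) and inertial constraints jointly shape every keyframe update, rather than being fused loosely after separate estimations.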
@article{guan2025_2411.03928,
  title   = {DEIO: Deep Event Inertial Odometry},
  author  = {Weipeng Guan and Fuling Lin and Peiyu Chen and Peng Lu},
  journal = {arXiv preprint arXiv:2411.03928},
  year    = {2025}
}