
Bi-directional Self-Registration for Misaligned Infrared-Visible Image Fusion

Abstract

Acquiring accurately aligned multi-modal image pairs is fundamental to high-quality multi-modal image fusion. To address the lack of ground truth in current multi-modal image registration and fusion methods, we propose a novel self-supervised Bi-directional Self-Registration framework (B-SR). Specifically, B-SR uses a proxy data generator (PDG) and an inverse proxy data generator (IPDG) to achieve self-supervised global-local registration. Spatially misaligned visible-infrared image pairs are passed through the registration module to obtain global differences. The same image pairs are transformed by the PDG (e.g., cropping, flipping, and stitching) and then registered to obtain local differences. The IPDG converts these local differences into pseudo-global differences, which are constrained to be consistent with the global differences (global-local difference consistency). Furthermore, to eliminate the effect of the modality gap on the registration module, we design a neighborhood dynamic alignment loss that enforces cross-modal edge alignment. Extensive experiments on misaligned multi-modal images demonstrate that the proposed method outperforms competing methods in multi-modal image alignment and fusion. Our code will be publicly available.
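
The following is a minimal PyTorch sketch of the global-local difference consistency idea described in the abstract, not the authors' implementation. It assumes a user-supplied registration network reg_net that maps a visible-infrared pair to a dense deformation field of shape (B, 2, H, W). The helpers proxy_crop and paste_back are hypothetical simplifications of the PDG and IPDG: the sketch only crops, whereas the paper's PDG also uses flipping and stitching, and the neighborhood dynamic alignment loss is omitted.

import torch
import torch.nn.functional as F

def proxy_crop(vis, ir, size=128):
    """Simplified PDG: take a random crop of the misaligned pair."""
    _, _, h, w = vis.shape
    top = torch.randint(0, h - size + 1, (1,)).item()
    left = torch.randint(0, w - size + 1, (1,)).item()
    box = (top, left, size)
    return (vis[..., top:top + size, left:left + size],
            ir[..., top:top + size, left:left + size], box)

def paste_back(local_flow, box, full_shape):
    """Simplified IPDG: embed the local deformation field at its original
    location to form a pseudo-global difference (zeros elsewhere), plus a
    validity mask for the cropped region."""
    top, left, size = box
    pseudo = torch.zeros(full_shape, device=local_flow.device)
    pseudo[..., top:top + size, left:left + size] = local_flow
    mask = torch.zeros_like(pseudo[:, :1])
    mask[..., top:top + size, left:left + size] = 1.0
    return pseudo, mask

def consistency_loss(reg_net, vis, ir):
    """The global difference predicted on the full pair should agree with the
    pseudo-global difference reconstructed from the proxy pair inside the crop."""
    global_flow = reg_net(vis, ir)                 # global difference
    vis_p, ir_p, box = proxy_crop(vis, ir)
    local_flow = reg_net(vis_p, ir_p)              # local difference
    pseudo_global, mask = paste_back(local_flow, box, global_flow.shape)
    return F.l1_loss(global_flow * mask, pseudo_global * mask)

In this sketch the loss is only evaluated inside the cropped region via the mask; in practice it would be combined with the registration and fusion objectives during training.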

@article{li2025_2505.06920,
  title={Bi-directional Self-Registration for Misaligned Infrared-Visible Image Fusion},
  author={Timing Li and Bing Cao and Pengfei Zhu and Bin Xiao and Qinghua Hu},
  journal={arXiv preprint arXiv:2505.06920},
  year={2025}
}