
GM-MoE: Low-Light Enhancement with Gated-Mechanism Mixture-of-Experts

Abstract

Low-light enhancement has wide applications in autonomous driving, 3D reconstruction, remote sensing, and surveillance, where it can significantly improve information utilization. However, most existing methods generalize poorly and are limited to specific tasks such as image recovery. To address these issues, we propose Gated-Mechanism Mixture-of-Experts (GM-MoE), the first framework to introduce a mixture-of-experts network for low-light image enhancement. GM-MoE comprises a dynamic gated weight-conditioning network and three sub-expert networks, each specializing in a distinct enhancement task. A self-designed gating mechanism dynamically adjusts the weights of the sub-expert networks for different data domains. Additionally, we integrate local and global feature fusion within the sub-expert networks to improve image quality by capturing multi-scale features. Experimental results demonstrate that GM-MoE generalizes better than 25 compared approaches, achieving state-of-the-art PSNR on 5 benchmarks and state-of-the-art SSIM on 4 benchmarks.
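To make the gating idea concrete, below is a minimal PyTorch sketch of a gated mixture-of-experts for image enhancement. It is not the authors' released implementation: the module names, the expert architecture, the channel widths, and the use of global average pooling as the gate's conditioning signal are all illustrative assumptions; only the overall structure (a gating network producing per-image weights over three sub-experts, whose outputs are combined as a weighted sum) follows the abstract.

```python
# Illustrative sketch of a gated mixture-of-experts enhancer.
# Expert design, channel sizes, and gate input are assumptions,
# not the GM-MoE paper's actual architecture.
import torch
import torch.nn as nn


class Expert(nn.Module):
    """Stand-in sub-expert: a small convolutional enhancement branch."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)


class GatedMoE(nn.Module):
    """Gating network predicts per-image weights over the sub-experts;
    the output is the weighted sum of the expert outputs."""

    def __init__(self, num_experts: int = 3):
        super().__init__()
        self.experts = nn.ModuleList(Expert() for _ in range(num_experts))
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),   # global image statistics as gate input
            nn.Flatten(),
            nn.Linear(3, num_experts),
            nn.Softmax(dim=-1),        # convex combination over experts
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = self.gate(x)                                       # (B, E)
        outputs = torch.stack([e(x) for e in self.experts], dim=1)   # (B, E, 3, H, W)
        weights = weights.view(*weights.shape, 1, 1, 1)              # broadcastable
        return (weights * outputs).sum(dim=1)                        # (B, 3, H, W)


# Usage: a batch of low-light RGB images in [0, 1].
x = torch.rand(2, 3, 256, 256)
enhanced = GatedMoE()(x)
print(enhanced.shape)  # torch.Size([2, 3, 256, 256])
```

Because the softmax weights depend on the input image, the gate can shift weight toward different sub-experts for different data domains, which is the mechanism the abstract credits for the method's generalization.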

@article{liao2025_2503.07417,
  title={GM-MoE: Low-Light Enhancement with Gated-Mechanism Mixture-of-Experts},
  author={Minwen Liao and Hao Bo Dong and Xinyi Wang and Ziyang Yan and Yihua Shao},
  journal={arXiv preprint arXiv:2503.07417},
  year={2025}
}