

Lightweight Hypercomplex MRI Reconstruction: A Generalized Kronecker-Parameterized Approach

13 March 2025
Haosen Zhang
Jiahao Huang
Yinzhe Wu
Congren Dai
Fanwen Wang
Zhenxuan Zhang
Guang Yang
Abstract

Magnetic Resonance Imaging (MRI) is crucial for clinical diagnostics but is hindered by prolonged scan times. Current deep learning models enhance MRI reconstruction but are often memory-intensive and unsuitable for resource-limited systems. This paper introduces a lightweight MRI reconstruction model leveraging Kronecker-Parameterized Hypercomplex Neural Networks to achieve high performance with reduced parameters. By integrating Kronecker-based modules, including Kronecker MLP, Kronecker Window Attention, and Kronecker Convolution, the proposed model efficiently extracts spatial features while preserving representational power. We introduce Kronecker U-Net and Kronecker SwinMR, which maintain high reconstruction quality with approximately 50% fewer parameters compared to existing models. Experimental evaluation on the FastMRI dataset demonstrates competitive PSNR, SSIM, and LPIPS metrics, even at high acceleration factors (8x and 16x), with no significant performance drop. Additionally, Kronecker variants exhibit superior generalization and reduced overfitting on limited datasets, facilitating efficient MRI reconstruction on hardware-constrained systems. This approach sets a new benchmark for parameter-efficient medical imaging models.
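The core parameter-saving idea behind Kronecker-parameterized layers is to replace a dense weight matrix W with a Kronecker product of two much smaller factors, W = A ⊗ B, so that only the factors are stored and the matrix-vector product is computed by reshaping. The paper's specific module designs are not reproduced here; the sketch below only illustrates the standard Kronecker-factorized linear map that such modules build on, with hypothetical shapes chosen for the example.

```python
import numpy as np

def kron_matvec(A, B, x):
    """Compute (A kron B) @ x without forming the full Kronecker product.

    A: (m, p), B: (n, q), x: vector of length p*q -> vector of length m*n.
    Uses the identity (A kron B) vec(X) = vec(A @ X @ B.T) for row-major vec,
    so only m*p + n*q parameters are stored instead of (m*n)*(p*q).
    """
    m, p = A.shape
    n, q = B.shape
    X = x.reshape(p, q)           # row-major reshape of the input vector
    return (A @ X @ B.T).ravel()  # (m, n) result, flattened row-major

# Illustrative shapes (not from the paper): factors hold 12 + 10 = 22
# parameters, versus 20 * 6 = 120 for the equivalent dense weight matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((5, 2))
x = rng.standard_normal(6)

y_fast = kron_matvec(A, B, x)
y_full = np.kron(A, B) @ x  # dense reference for verification
assert np.allclose(y_fast, y_full)
```

The same reshaping trick generalizes to the MLP, window-attention, and convolution weights mentioned in the abstract; the factor shapes there determine the actual parameter reduction, which the paper reports as roughly 50%.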

@article{zhang2025_2503.05063,
  title={Lightweight Hypercomplex MRI Reconstruction: A Generalized Kronecker-Parameterized Approach},
  author={Haosen Zhang and Jiahao Huang and Yinzhe Wu and Congren Dai and Fanwen Wang and Zhenxuan Zhang and Guang Yang},
  journal={arXiv preprint arXiv:2503.05063},
  year={2025}
}