Skull stripping is a common preprocessing step in Magnetic Resonance Imaging (MRI) pipelines, including functional MRI (fMRI), and is often performed manually, a process that is time-consuming and operator-dependent. Automating this step for preclinical data is challenging due to variations in brain geometry, resolution, and tissue contrast. Existing MRI skull-stripping methods often struggle with the low resolution and varying slice sizes of preclinical fMRI data. This study proposes SST-DUNet, a novel method for fMRI skull stripping that integrates a dense UNet-based architecture with a feature extractor based on the Smart Swin Transformer (SST). The Smart Shifted Window Multi-Head Self-Attention (SSW-MSA) module in SST replaces the mask-based module of the Swin Transformer (ST), enabling the model to learn distinct channel-wise features while focusing on relevant dependencies within brain structures. This modification allows the model to better handle the complexities of fMRI skull stripping, such as low resolution and variable slice sizes. To address class imbalance in preclinical data, a combined loss function based on Focal and Dice loss is used. The model was trained on rat fMRI images and evaluated on three in-house datasets, achieving Dice similarity scores of 98.65%, 97.86%, and 98.04%, respectively. The fMRI results obtained through automatic skull stripping with SST-DUNet closely align with those from manual skull stripping for both seed-based and independent component analyses. These results indicate that SST-DUNet can effectively replace manual brain extraction in rat fMRI analysis.
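The abstract states that class imbalance (brain voxels being far outnumbered by background) is handled with a combined Focal and Dice loss. The sketch below illustrates one common way such a combination can be implemented for binary brain-mask segmentation; it is a minimal example, assuming a PyTorch setting, and the weighting factor `alpha`, focal exponent `gamma`, and Dice smoothing constant are illustrative assumptions rather than values reported in the paper.

```python
# Hedged sketch of a combined Focal + Dice loss for binary skull stripping.
# Framework choice (PyTorch) and the hyperparameters alpha, gamma, and smooth
# are assumptions for illustration, not the paper's reported configuration.
import torch
import torch.nn.functional as F


def dice_loss(logits, targets, smooth=1.0):
    """Soft Dice loss on sigmoid probabilities for a binary brain mask."""
    probs = torch.sigmoid(logits).flatten(1)
    targets = targets.flatten(1)
    intersection = (probs * targets).sum(dim=1)
    union = probs.sum(dim=1) + targets.sum(dim=1)
    return 1.0 - ((2.0 * intersection + smooth) / (union + smooth)).mean()


def focal_loss(logits, targets, gamma=2.0):
    """Binary focal loss: down-weights easy, well-classified voxels."""
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = torch.exp(-bce)  # probability assigned to the true class
    return (((1.0 - p_t) ** gamma) * bce).mean()


def combined_loss(logits, targets, alpha=0.5, gamma=2.0):
    """Weighted sum of Focal and Dice terms to counter class imbalance."""
    return alpha * focal_loss(logits, targets, gamma) + (1.0 - alpha) * dice_loss(logits, targets)


if __name__ == "__main__":
    # Dummy batch: predicted logits and a sparse ground-truth brain mask.
    logits = torch.randn(2, 1, 64, 64)
    masks = (torch.rand(2, 1, 64, 64) > 0.8).float()
    print(combined_loss(logits, masks).item())
```

Pairing the two terms is a standard design choice for imbalanced segmentation: the Dice term optimizes region overlap directly, while the focal term keeps gradients informative for the minority (brain) voxels.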
@article{soltanpour2025_2504.19937,
  title   = {SST-DUNet: Automated preclinical functional MRI skull stripping using Smart Swin Transformer and Dense UNet},
  author  = {Sima Soltanpour and Rachel Utama and Arnold Chang and Md Taufiq Nasseef and Dan Madularu and Praveen Kulkarni and Craig Ferris and Chris Joslin},
  journal = {arXiv preprint arXiv:2504.19937},
  year    = {2025}
}