EEG2GAIT: A Hierarchical Graph Convolutional Network for EEG-based Gait Decoding

Decoding gait dynamics from EEG signals presents significant challenges due to the complex spatial dependencies of motor processes, the need for accurate temporal and spectral feature extraction, and the scarcity of high-quality gait EEG datasets. To address these issues, we propose EEG2GAIT, a novel hierarchical graph-based model that captures multi-level spatial embeddings of EEG channels using a Hierarchical Graph Convolutional Network (GCN) Pyramid. To further improve decoding accuracy, we introduce a Hybrid Temporal-Spectral Reward (HTSR) loss function, which combines time-domain, frequency-domain, and reward-based loss components. Moreover, we contribute a new Gait-EEG Dataset (GED) consisting of synchronized EEG and lower-limb joint-angle data collected from 50 participants over two lab visits. Validation experiments on both the GED and the publicly available Mobile Brain-Body Imaging (MoBI) dataset demonstrate that EEG2GAIT outperforms state-of-the-art methods, achieving the most accurate joint-angle predictions. Ablation studies validate the contributions of the hierarchical GCN modules and the HTSR loss, while saliency maps reveal the significance of motor-related brain regions in the decoding task. These findings underscore EEG2GAIT's potential for advancing brain-computer interface applications, particularly in lower-limb rehabilitation and assistive technologies.
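The abstract describes the HTSR loss as a weighted combination of time-domain, frequency-domain, and reward-based components. The sketch below illustrates that general structure; the exact formulation, weights, and the reward term used in EEG2GAIT are not given here, so the choices below (MSE in both domains, a correlation-based reward, and the weights) are hypothetical placeholders.

```python
import numpy as np

def htsr_loss(pred, target, w_time=1.0, w_spec=0.5, w_reward=0.1):
    """Illustrative Hybrid Temporal-Spectral Reward (HTSR) style loss.

    NOTE: the paper's exact formulation is not reproduced here; the
    weights and the reward term below are hypothetical placeholders.
    `pred` and `target` are arrays of joint-angle trajectories with
    time as the last axis.
    """
    # Time-domain component: mean squared error on the trajectories.
    l_time = np.mean((pred - target) ** 2)

    # Frequency-domain component: MSE between magnitude spectra.
    ps = np.abs(np.fft.rfft(pred, axis=-1))
    ts = np.abs(np.fft.rfft(target, axis=-1))
    l_spec = np.mean((ps - ts) ** 2)

    # Reward-based component (hypothetical): a negative correlation
    # bonus that rewards predictions tracking the target's waveform
    # shape, independent of its amplitude offset.
    p = pred - pred.mean(axis=-1, keepdims=True)
    t = target - target.mean(axis=-1, keepdims=True)
    corr = np.sum(p * t, axis=-1) / (
        np.linalg.norm(p, axis=-1) * np.linalg.norm(t, axis=-1) + 1e-8
    )
    l_reward = -np.mean(corr)

    return w_time * l_time + w_spec * l_spec + w_reward * l_reward
```

A perfect prediction zeroes the time- and frequency-domain terms and earns the full correlation reward, so the loss decreases monotonically as the predicted gait trajectory approaches the ground truth in shape, timing, and spectral content.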
@article{fu2025_2504.03757,
  title={EEG2GAIT: A Hierarchical Graph Convolutional Network for EEG-based Gait Decoding},
  author={Xi Fu and Rui Liu and Aung Aung Phyo Wai and Hannah Pulferer and Neethu Robinson and Gernot R Müller-Putz and Cuntai Guan},
  journal={arXiv preprint arXiv:2504.03757},
  year={2025}
}