Graph Adapter of EEG Foundation Models for Parameter Efficient Fine Tuning

Abstract

In diagnosing neurological disorders from electroencephalography (EEG) data, foundation models such as Transformers have been employed to capture temporal dynamics. In addition, Graph Neural Networks (GNNs) are critical for representing the spatial relationships among EEG sensors. However, fine-tuning these large-scale models for both temporal and spatial features can be prohibitively expensive, especially given the limited availability of labeled EEG datasets. We propose EEG-GraphAdapter (EGA), a parameter-efficient fine-tuning (PEFT) approach designed to address these challenges. EGA is integrated into a pre-trained temporal backbone model as a GNN-based module; the backbone is frozen and only the adapter is fine-tuned. This enables the effective acquisition of spatial EEG representations while significantly reducing computational overhead and data requirements. Experimental evaluations on two healthcare-related downstream tasks, Major Depressive Disorder detection (MDD) and Abnormality Detection (TUAB), show that EGA improves performance by up to 16.1% in F1-score over the BENDR backbone model, highlighting its potential for scalable and accurate EEG-based predictions.
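
To make the freeze-backbone/train-adapter recipe concrete, here is a minimal PyTorch-style sketch of the idea described in the abstract. It is not the paper's exact architecture: the class names (GraphAdapter, EGAClassifier), the placement of the adapter after the backbone, the residual connection, the mean-pooling classification head, and all dimensions are illustrative assumptions, and the linear "backbone" merely stands in for a pre-trained temporal encoder such as BENDR.

import torch
import torch.nn as nn

class GraphAdapter(nn.Module):
    """One GCN-style message-passing layer over the EEG sensor graph (sketch)."""
    def __init__(self, feat_dim: int, adj: torch.Tensor):
        super().__init__()
        self.register_buffer("adj", adj)        # (C, C) normalized channel adjacency
        self.lin = nn.Linear(feat_dim, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, C, feat_dim) per-channel temporal features
        h = torch.einsum("ij,bjd->bid", self.adj, self.lin(x))
        return x + torch.relu(h)                # residual: frozen features pass through

class EGAClassifier(nn.Module):
    def __init__(self, backbone: nn.Module, adapter: nn.Module,
                 feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False             # freeze the pre-trained temporal backbone
        self.adapter = adapter                  # only the adapter (and head) are trained
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, eeg: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(eeg)              # (batch, C, feat_dim) temporal features
        feats = self.adapter(feats)             # inject spatial (sensor-graph) structure
        return self.head(feats.mean(dim=1))     # pool over channels, then classify

# Toy usage: a linear stand-in for the temporal encoder, and an identity
# adjacency as a placeholder for a real normalized sensor-montage graph.
C, T, D = 19, 256, 64
backbone = nn.Linear(T, D)                      # maps (batch, C, T) -> (batch, C, D)
model = EGAClassifier(backbone, GraphAdapter(D, torch.eye(C)),
                      feat_dim=D, num_classes=2)
logits = model(torch.randn(8, C, T))            # -> (8, 2)

Since the backbone's parameters have requires_grad=False, passing only the trainable parameters (adapter and head) to the optimizer yields the parameter-efficient fine-tuning setup the abstract describes.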

@article{suzumura2025_2411.16155,
  title={Graph Adapter of EEG Foundation Models for Parameter Efficient Fine Tuning},
  author={Toyotaro Suzumura and Hiroki Kanezashi and Shotaro Akahori},
  journal={arXiv preprint arXiv:2411.16155},
  year={2025}
}