ResearchTrend.AI

BP-GPT: Auditory Neural Decoding Using fMRI-prompted LLM

24 February 2025
Xiaoyu Chen
Changde Du
Che Liu
Yizhe Wang
Huiguang He
Abstract

Decoding language information from brain signals is a vital research area within brain-computer interfaces, particularly for deciphering semantic information from fMRI signals. Although existing work applies LLMs to this goal, those methods are not end-to-end and leave the LLM out of the fMRI-to-text mapping, leaving room to explore LLMs for auditory decoding. In this paper, we introduce a novel method, Brain Prompt GPT (BP-GPT). Using a brain representation extracted from the fMRI signal as a prompt, our method drives GPT-2 to decode fMRI signals into the stimulus text. We further introduce a text prompt and align the fMRI prompt to it; this alignment lets BP-GPT extract a more robust brain prompt and improves decoding by the pre-trained LLM. We evaluate BP-GPT on an open-source auditory semantic decoding dataset and achieve significant improvements of up to 4.61 on METEOR and 2.43 on BERTScore across all subjects compared to the state-of-the-art method. The experimental results demonstrate that using a brain representation as a prompt to drive an LLM for auditory neural decoding is feasible and effective. The code is available at this https URL.
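The core idea in the abstract — projecting an fMRI vector into "soft prompt" embeddings that sit in front of GPT-2's token embeddings, then pulling that brain prompt toward a text prompt during training — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the linear projection, the mean-squared alignment loss, and all names and dimensions (`d_fmri`, prefix length `k`, GPT-2's hidden size of 768) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the paper):
# d_fmri = fMRI feature size, k = soft-prompt length, d_model = GPT-2 hidden size
d_fmri, k, d_model = 1000, 8, 768

# Hypothetical linear brain encoder: maps one fMRI vector to k prefix embeddings
W = rng.normal(scale=0.02, size=(d_fmri, k * d_model))

def fmri_to_prompt(x):
    """Project an fMRI vector into a sequence of k soft-prompt embeddings.

    In BP-GPT these embeddings would be concatenated with the token
    embeddings of the target text and fed to GPT-2 via its embedding input.
    """
    return (x @ W).reshape(k, d_model)

fmri = rng.normal(size=(d_fmri,))
brain_prompt = fmri_to_prompt(fmri)

# Alignment objective (sketch): pull the brain prompt toward a "text prompt"
# derived from the stimulus transcript. A random matrix stands in for the
# text-prompt embeddings here.
text_prompt = rng.normal(size=(k, d_model))
align_loss = np.mean((brain_prompt - text_prompt) ** 2)

print(brain_prompt.shape)  # prefix ready to prepend to GPT-2 token embeddings
```

At decoding time, the learned prefix would be passed to the LLM (e.g. via an `inputs_embeds`-style interface) so that generation is conditioned on the brain signal rather than on text alone.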

View on arXiv
@article{chen2025_2502.15172,
  title={BP-GPT: Auditory Neural Decoding Using fMRI-prompted LLM},
  author={Xiaoyu Chen and Changde Du and Che Liu and Yizhe Wang and Huiguang He},
  journal={arXiv preprint arXiv:2502.15172},
  year={2025}
}