ResearchTrend.AI

arXiv:2308.11654 — Cited By
Large Transformers are Better EEG Learners

20 August 2023
Bingxin Wang, Xiao-Ying Fu, Yuan Lan, Luchan Zhang, Wei Zheng, Yang Xiang
Links: arXiv · PDF · HTML

Papers citing "Large Transformers are Better EEG Learners"

5 of 5 papers shown

Training language models to follow instructions with human feedback
Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
OSLM · ALM · 313 · 11,953 · 0 · 04 Mar 2022

Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification
Zhenhailong Wang, Heng Ji
84 · 71 · 0 · 05 Dec 2021

MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
Sachin Mehta, Mohammad Rastegari
ViT · 215 · 1,213 · 0 · 05 Oct 2021

BENDR: using transformers and a contrastive self-supervised learning task to learn from massive amounts of EEG data
Demetres Kostas, Stephane Aroca-Ouellette, Frank Rudzicz
SSL · 41 · 202 · 0 · 28 Jan 2021

Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
SSL · 267 · 3,371 · 0 · 09 Mar 2020