Large Transformers are Better EEG Learners (arXiv:2308.11654)
20 August 2023
Bingxin Wang, Xiao-Ying Fu, Yuan Lan, Luchan Zhang, Wei Zheng, Yang Xiang
Papers citing "Large Transformers are Better EEG Learners" (5 of 5 papers shown):
1. Training language models to follow instructions with human feedback [OSLM, ALM]
   Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
   04 Mar 2022
2. Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification
   Zhenhailong Wang, Heng Ji
   05 Dec 2021
3. MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer [ViT]
   Sachin Mehta, Mohammad Rastegari
   05 Oct 2021
4. BENDR: using transformers and a contrastive self-supervised learning task to learn from massive amounts of EEG data [SSL]
   Demetres Kostas, Stephane Aroca-Ouellette, Frank Rudzicz
   28 Jan 2021
5. Improved Baselines with Momentum Contrastive Learning [SSL]
   Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
   09 Mar 2020