Population Transformer: Learning Population-level Representations of Neural Activity

5 June 2024
Geeling Chau, Christopher Wang, Sabera Talukder, Vighnesh Subramaniam, Saraswati Soedarmadji, Yisong Yue, Boris Katz, Andrei Barbu
MedIm

Papers citing "Population Transformer: Learning Population-level Representations of Neural Activity"

TOTEM: TOkenized Time Series EMbeddings for General Time Series Analysis
Sabera Talukder, Yisong Yue, Georgia Gkioxari
AI4TS · 03 Jan 2025

Du-IN: Discrete units-guided mask modeling for decoding speech from Intracranial Neural signals
Hui Zheng, Haiteng Wang, Wei-Bang Jiang, Zhongtao Chen, Li He, Pei-Yang Lin, Peng-Hu Wei, Guo-Guang Zhao, Yun-Zhe Liu
19 May 2024

Generalizability Under Sensor Failure: Tokenization + Transformers Enable More Robust Latent Spaces
Geeling Chau, Yujin An, Ahamed Raffey Iqbal, Soon-Jo Chung, Yisong Yue, Sabera Talukder
OOD · 28 Feb 2024

BENDR: using transformers and a contrastive self-supervised learning task to learn from massive amounts of EEG data
Demetres Kostas, Stephane Aroca-Ouellette, Frank Rudzicz
SSL · 28 Jan 2021