ResearchTrend.AI
Brain-Like Language Processing via a Shallow Untrained Multihead Attention Network

21 June 2024
Badr AlKhamissi, Greta Tuckute, Antoine Bosselut, Martin Schrimpf

Papers citing "Brain-Like Language Processing via a Shallow Untrained Multihead Attention Network"

4 / 4 papers shown
Model Connectomes: A Generational Approach to Data-Efficient Language Models
Klemen Kotar, Greta Tuckute
29 Apr 2025
What Are Large Language Models Mapping to in the Brain? A Case Against Over-Reliance on Brain Scores
Ebrahim Feghhi, Nima Hadidi, Bryan Song, I. Blank, Jonathan C. Kao
03 Jun 2024
Duality of Bures and Shape Distances with Implications for Comparing Neural Representations
Sarah E. Harvey, Brett W. Larsen, Alex H. Williams
19 Nov 2023
Inductive biases, pretraining and fine-tuning jointly account for brain responses to speech
Juliette Millet, J. King
25 Feb 2021