arXiv: 2109.04939
Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
10 September 2021
Ryo Yoshida, Hiroshi Noji, Yohei Oseki
Papers citing "Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars" (5 papers)
Large Language Models Are Human-Like Internally
Tatsuki Kuribayashi, Yohei Oseki, Souhaib Ben Taieb, Kentaro Inui, Timothy Baldwin
03 Feb 2025
Emergent Word Order Universals from Cognitively-Motivated Language Models
Tatsuki Kuribayashi, Ryo Ueda, Ryosuke Yoshida, Yohei Oseki, Ted Briscoe, Timothy Baldwin
19 Feb 2024
Psychometric Predictive Power of Large Language Models
Tatsuki Kuribayashi, Yohei Oseki, Timothy Baldwin
13 Nov 2023
Composition, Attention, or Both?
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Ryosuke Yoshida, Yohei Oseki
24 Oct 2022
Context Limitations Make Neural Language Models More Human-Like
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Tatsuki Kuribayashi, Yohei Oseki, Ana Brassard, Kentaro Inui
23 May 2022