Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars

Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
10 September 2021
Ryo Yoshida, Hiroshi Noji, Yohei Oseki
arXiv: 2109.04939 · PDF · HTML · GitHub (11782★)

Papers citing "Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars"

5 citing papers
Large Language Models Are Human-Like Internally
Tatsuki Kuribayashi, Yohei Oseki, Souhaib Ben Taieb, Kentaro Inui, Timothy Baldwin
03 Feb 2025
Emergent Word Order Universals from Cognitively-Motivated Language Models
Tatsuki Kuribayashi, Ryo Ueda, Ryosuke Yoshida, Yohei Oseki, Ted Briscoe, Timothy Baldwin
19 Feb 2024
Psychometric Predictive Power of Large Language Models
Tatsuki Kuribayashi, Yohei Oseki, Timothy Baldwin
13 Nov 2023
Composition, Attention, or Both?
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Ryosuke Yoshida, Yohei Oseki
24 Oct 2022
Context Limitations Make Neural Language Models More Human-Like
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Tatsuki Kuribayashi, Yohei Oseki, Ana Brassard, Kentaro Inui
23 May 2022