Recurrent Neural Network Language Models Always Learn English-Like Relative Clause Attachment

Forrest Davis, Marten van Schijndel · 1 May 2020 · arXiv:2005.00165

Papers citing "Recurrent Neural Network Language Models Always Learn English-Like Relative Clause Attachment" (4 papers shown)

Who Relies More on World Knowledge and Bias for Syntactic Ambiguity Resolution: Humans or LLMs?
So Young Lee, Russell Scheinberg, Amber Shore, Ameeta Agrawal · 13 Mar 2025

Optimal word order for non-causal text generation with Large Language Models: the Spanish case
Andrea Busto-Castiñeira, Silvia García-Méndez, Francisco de Arriba-Pérez, Francisco J. González Castaño · 21 Feb 2025

Controlled Evaluation of Grammatical Knowledge in Mandarin Chinese Language Models
Yiwen Wang, Jennifer Hu, R. Levy, Peng Qian · 22 Sep 2021

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman · 20 Apr 2018