ResearchTrend.AI
Exploring the Syntactic Abilities of RNNs with Multi-task Learning

12 June 2017
Émile Enguehard, Yoav Goldberg, Tal Linzen
arXiv: 1706.03542

Papers citing "Exploring the Syntactic Abilities of RNNs with Multi-task Learning"

19 papers shown.
1. A Language Model with Limited Memory Capacity Captures Interference in Human Sentence Processing. William Timkey, Tal Linzen. 24 Oct 2023.
2. Syntactic Surprisal From Neural Models Predicts, But Underestimates, Human Processing Difficulty From Syntactic Ambiguities. Suhas Arehalli, Brian Dillon, Tal Linzen. 21 Oct 2022.
3. Discourse structure interacts with reference but not syntax in neural language models. Forrest Davis, Marten van Schijndel. 10 Oct 2020.
4. Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages. Tyler A. Chang, Anna N. Rafferty. 17 May 2020.
5. Recurrent Neural Network Language Models Always Learn English-Like Relative Clause Attachment. Forrest Davis, Marten van Schijndel. 01 May 2020.
6. An Analysis of the Utility of Explicit Negative Examples to Improve the Syntactic Abilities of Neural Language Models. Hiroshi Noji, Hiroya Takamura. 06 Apr 2020.
7. Assessing the Memory Ability of Recurrent Neural Networks. Cheng Zhang, Qiuchi Li, L. Hua, D. Song. 18 Feb 2020.
8. Does syntax need to grow on trees? Sources of hierarchical inductive bias in sequence-to-sequence networks. R. Thomas McCoy, Robert Frank, Tal Linzen. 10 Jan 2020.
9. Using Priming to Uncover the Organization of Syntactic Representations in Neural Language Models. Grusha Prasad, Marten van Schijndel, Tal Linzen. 23 Sep 2019.
10. Quantity doesn't buy quality syntax with neural language models. Marten van Schijndel, Aaron Mueller, Tal Linzen. 31 Aug 2019.
11. What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions? Miryam de Lhoneux, Sara Stymne, Joakim Nivre. 18 Jul 2019.
12. Scalable Syntax-Aware Language Models Using Knowledge Distillation. A. Kuncoro, Chris Dyer, Laura Rimell, S. Clark, Phil Blunsom. 14 Jun 2019.
13. Studying the Inductive Biases of RNNs with Synthetic Variations of Natural Languages. Shauli Ravfogel, Yoav Goldberg, Tal Linzen. 15 Mar 2019.
14. Neural Language Models as Psycholinguistic Subjects: Representations of Syntactic State. Richard Futrell, Ethan Gotlieb Wilcox, Takashi Morita, Peng Qian, Miguel Ballesteros, R. Levy. 08 Mar 2019.
15. RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency. Richard Futrell, Ethan Gotlieb Wilcox, Takashi Morita, R. Levy. 05 Sep 2018.
16. Targeted Syntactic Evaluation of Language Models. Rebecca Marvin, Tal Linzen. 27 Aug 2018.
17. Distinct patterns of syntactic agreement errors in recurrent networks and humans. Tal Linzen, Brian Leonard. 18 Jul 2018.
18. Are All Languages Equally Hard to Language-Model? Ryan Cotterell, Sabrina J. Mielke, Jason Eisner, Brian Roark. 10 Jun 2018.
19. Colorless green recurrent networks dream hierarchically. Kristina Gulordava, Piotr Bojanowski, Edouard Grave, Tal Linzen, Marco Baroni. 29 Mar 2018.