
Are we there yet? Encoder-decoder neural networks as cognitive models of English past tense inflection (arXiv:1906.01280)

Annual Meeting of the Association for Computational Linguistics (ACL), 2019
4 June 2019
M. Corkery, Yevgen Matusevych, Sharon Goldwater

Papers citing "Are we there yet? Encoder-decoder neural networks as cognitive models of English past tense inflection"

15 citing papers listed.
Evaluating the cognitive reality of Spanish irregular morphomic patterns: Humans vs. Transformers
Akhilesh Kakolu Ramarao, Kevin Tang, Dinah Baer-Henney
29 Jul 2025

Is deeper always better? Replacing linear mappings with deep learning networks in the Discriminative Lexicon Model
Linguistics Vanguard (LV), 2024
Maria Heitmeier, Valeria Schmidt, Hendrik P. A. Lensch, R. Baayen
05 Oct 2024

LLMs' morphological analyses of complex FST-generated Finnish words
Anssi Moisio, Mathias Creutz, M. Kurimo
11 Jul 2024

Why Linguistics Will Thrive in the 21st Century: A Reply to Piantadosi (2023)
Jordan Kodner, Sarah Payne, Jeffrey Heinz
06 Aug 2023

Evaluating Transformer Models and Human Behaviors on Chinese Character Naming
Transactions of the Association for Computational Linguistics (TACL), 2023
Xiaomeng Ma, Lingyu Gao
22 Mar 2023

A Comprehensive Comparison of Neural Networks as Cognitive Models of Inflection
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Adam Wiemerslage, Shiran Dudy, Katharina Kann
22 Oct 2022

How do we get there? Evaluating transformer neural networks as cognitive models for English past tense inflection
Xiaomeng Ma, Lingyu Gao
17 Oct 2022

State-of-the-art generalisation research in NLP: A taxonomy and review
Nature Machine Intelligence (Nat. Mach. Intell.), 2022
Dieuwke Hupkes, Mario Giulianelli, Verna Dankers, Mikel Artetxe, Yanai Elazar, ..., Leila Khalatbari, Maria Ryskina, Rita Frieske, Robert Bamler, Zhijing Jin
06 Oct 2022

Did AI get more negative recently?
Royal Society Open Science (RSOS), 2022
Dominik Beese, Begüm Altunbaş, Görkem Güzeler, Steffen Eger
28 Feb 2022

Not quite there yet: Combining analogical patterns and encoder-decoder networks for cognitively plausible inflection
Basilio Calderone, Nabil Hathout, Olivier Bonami
09 Aug 2021

Computational Morphology with Neural Network Approaches
Ling Liu
19 May 2021

The Greedy and Recursive Search for Morphological Productivity
Annual Meeting of the Cognitive Science Society (CogSci), 2021
Caleb Belth, Sarah Payne, Deniz Beser, Jordan Kodner, Charles D. Yang
12 May 2021

Falling Through the Gaps: Neural Architectures as Models of Morphological Rule Learning
Annual Meeting of the Cognitive Science Society (CogSci), 2021
Deniz Beser
08 May 2021

Inflecting when there's no majority: Limitations of encoder-decoder neural networks as cognitive models for German plurals
Kate McCurdy, Sharon Goldwater, Adam Lopez
18 May 2020

The Paradigm Discovery Problem
Annual Meeting of the Association for Computational Linguistics (ACL), 2020
Alexander Erdmann, Micha Elsner, Shijie Wu, Robert Bamler, Farah E. Shamout
04 May 2020