
How Do Neural Sequence Models Generalize? Local and Global Context Cues for Out-of-Distribution Prediction

4 November 2021
Anthony Bau
Jacob Andreas

Papers citing "How Do Neural Sequence Models Generalize? Local and Global Context Cues for Out-of-Distribution Prediction"

2 citing papers
State-of-the-art generalisation research in NLP: A taxonomy and review
Dieuwke Hupkes
Mario Giulianelli
Verna Dankers
Mikel Artetxe
Yanai Elazar
...
Leila Khalatbari
Maria Ryskina
Rita Frieske
Ryan Cotterell
Zhijing Jin
06 Oct 2022
The Fine Line between Linguistic Generalization and Failure in Seq2Seq-Attention Models
Noah Weber
L. Shekhar
Niranjan Balasubramanian
03 May 2018