What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages

6 June 2024
Nadav Borenstein, Anej Svete, R. Chan, Josef Valvoda, Franz Nowak, Isabelle Augenstein, Eleanor Chodroff, Ryan Cotterell

Papers citing "What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages"

Showing 11 of 11 citing papers.

Better Estimation of the KL Divergence Between Language Models
Afra Amini, Tim Vieira, Ryan Cotterell
14 Apr 2025

Anything Goes? A Crosslinguistic Study of (Im)possible Language Learning in LMs
Xiulin Yang, Tatsuya Aoyama, Yuekun Yao, Ethan Wilcox
26 Feb 2025

Can Language Models Learn Typologically Implausible Languages?
Tianyang Xu, Tatsuki Kuribayashi, Yohei Oseki, Ryan Cotterell, Alex Warstadt
17 Feb 2025

Training Bilingual LMs with Data Constraints in the Targeted Language
Skyler Seto, Maartje ter Hoeve, He Bai, Natalie Schluter, David Grangier
20 Nov 2024

Training Neural Networks as Recognizers of Formal Languages
Alexandra Butoi, Ghazal Khalighinejad, Anej Svete, Josef Valvoda, Ryan Cotterell, Brian DuSell
11 Nov 2024

Can Transformers Learn $n$-gram Language Models?
Anej Svete, Nadav Borenstein, M. Zhou, Isabelle Augenstein, Ryan Cotterell
03 Oct 2024

Lower Bounds on the Expressivity of Recurrent Neural Language Models
Anej Svete, Franz Nowak, Anisha Mohamed Sahabdeen, Ryan Cotterell
29 May 2024

On Efficiently Representing Regular Languages as RNNs
Anej Svete, R. Chan, Ryan Cotterell
24 Feb 2024

OLMo: Accelerating the Science of Language Models
Dirk Groeneveld, Iz Beltagy, Pete Walsh, Akshita Bhagia, Rodney Michael Kinney, ..., Jesse Dodge, Kyle Lo, Luca Soldaini, Noah A. Smith, Hanna Hajishirzi
01 Feb 2024

Transparency at the Source: Evaluating and Interpreting Language Models With Access to the True Distribution
Jaap Jumelet, Willem H. Zuidema
23 Oct 2023

Neural Networks and the Chomsky Hierarchy
Grégoire Delétang, Anian Ruoss, Jordi Grau-Moya, Tim Genewein, L. Wenliang, ..., Chris Cundy, Marcus Hutter, Shane Legg, Joel Veness, Pedro A. Ortega
05 Jul 2022