Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models

Findings, 2021
10 May 2021
Laura Pérez-Mayos
Alba Táboas García
Simon Mille
Leo Wanner
ELM, LRM

Papers citing "Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models"

1 / 1 papers shown
A computational psycholinguistic evaluation of the syntactic abilities of Galician BERT models at the interface of dependency resolution and training time
Iria de-Dios-Flores
Marcos Garcia
06 Jun 2022