ResearchTrend.AI
Universal Dependencies according to BERT: both more specific and more general
Findings of EMNLP, 2020
30 April 2020
Tomasz Limisiewicz, Rudolf Rosa, David Mareček
arXiv: 2004.14620

Papers citing "Universal Dependencies according to BERT: both more specific and more general"

11 papers shown
Linguistic Interpretability of Transformer-based Language Models: a systematic review
Miguel López-Otal, Jorge Gracia, Jordi Bernad, Carlos Bobed, Lucía Pitarch-Ballesteros, Emma Anglés-Herrero
09 Apr 2025
Inducing Systematicity in Transformers by Attending to Structurally Quantized Embeddings
Annual Meeting of the Association for Computational Linguistics (ACL), 2024
Yichen Jiang, Xiang Zhou, Mohit Bansal
09 Feb 2024
Dynamic Syntax Mapping: A New Approach to Unsupervised Syntax Parsing
Buvarp Gohsh, Woods Ali, Michael Anders
18 Dec 2023
Mini-Model Adaptation: Efficiently Extending Pretrained Models to New Languages via Aligned Shallow Training
Annual Meeting of the Association for Computational Linguistics (ACL), 2022
Kelly Marchisio, Patrick Lewis, Yihong Chen, Mikel Artetxe
20 Dec 2022
Syntactic Substitutability as Unsupervised Dependency Syntax
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Jasper Jian, Siva Reddy
29 Nov 2022
State-of-the-art generalisation research in NLP: A taxonomy and review
Nature Machine Intelligence (Nat. Mach. Intell.), 2022
Dieuwke Hupkes, Mario Giulianelli, Verna Dankers, Mikel Artetxe, Yanai Elazar, ..., Leila Khalatbari, Maria Ryskina, Rita Frieske, Robert Bamler, Zhijing Jin
06 Oct 2022
Multilingual Transformer Encoders: a Word-Level Task-Agnostic Evaluation
IEEE International Joint Conference on Neural Networks (IJCNN), 2022
Félix Gaschi, François Plesse, Parisa Rastin, Y. Toussaint
19 Jul 2022
A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar
01 Jul 2021
On the Evolution of Syntactic Information Encoded by BERT's Contextualized Representations
Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2021
Laura Pérez-Mayos, Roberto Carlini, Miguel Ballesteros, Leo Wanner
27 Jan 2021
Attention Can Reflect Syntactic Structure (If You Let It)
Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2021
Vinit Ravishankar, Artur Kulmizev, Mostafa Abdou, Anders Søgaard, Joakim Nivre
26 Jan 2021
Syntax Representation in Word Embeddings and Neural Networks -- A Survey
Conference on Theory and Practice of Information Technologies (TPIT), 2020
Tomasz Limisiewicz, David Mareček
02 Oct 2020