An Investigation of the Interactions Between Pre-Trained Word Embeddings, Character Models and POS Tags in Dependency Parsing

Aaron Smith, Miryam de Lhoneux, Sara Stymne, Joakim Nivre
27 August 2018 · arXiv:1808.09060

Papers citing "An Investigation of the Interactions Between Pre-Trained Word Embeddings, Character Models and POS Tags in Dependency Parsing"

13 papers shown

1. Another Dead End for Morphological Tags? Perturbed Inputs and Parsing
   Alberto Muñoz-Ortiz, David Vilares
   24 May 2023

2. Parsing linearizations appreciate PoS tags - but some are fussy about errors
   Alberto Muñoz-Ortiz, Mark Anderson, David Vilares, Carlos Gómez-Rodríguez
   27 Oct 2022

3. The Fragility of Multi-Treebank Parsing Evaluation
   I. Alonso-Alonso, David Vilares, Carlos Gómez-Rodríguez
   14 Sep 2022

4. Sort by Structure: Language Model Ranking as Dependency Probing
   Max Müller-Eberstein, Rob van der Goot, Barbara Plank
   10 Jun 2022

5. Instance-Based Neural Dependency Parsing
   Hiroki Ouchi, Jun Suzuki, Sosuke Kobayashi, Sho Yokoi, Tatsuki Kuribayashi, Masashi Yoshikawa, Kentaro Inui
   28 Sep 2021

6. A Modest Pareto Optimisation Analysis of Dependency Parsers in 2021
   Mark Anderson, Carlos Gómez-Rodríguez
   08 Jun 2021

7. What Taggers Fail to Learn, Parsers Need the Most
   Mark Anderson, Carlos Gómez-Rodríguez
   02 Apr 2021

8. Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement
   Alireza Mohammadshahi, James Henderson
   29 Mar 2020

9. Deep Contextualized Word Embeddings in Transition-Based and Graph-Based Dependency Parsing -- A Tale of Two Parsers Revisited
   Artur Kulmizev, Miryam de Lhoneux, Johannes Gontrum, Elena Fano, Joakim Nivre
   20 Aug 2019

10. Better, Faster, Stronger Sequence Tagging Constituent Parsers
    David Vilares, Mostafa Abdou, Anders Søgaard
    28 Feb 2019

11. Recursive Subtree Composition in LSTM-Based Dependency Parsing
    Miryam de Lhoneux, Miguel Ballesteros, Joakim Nivre
    26 Feb 2019

12. Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing
    Tal Schuster, Ori Ram, Regina Barzilay, Amir Globerson
    25 Feb 2019

13. 82 Treebanks, 34 Models: Universal Dependency Parsing with Multi-Treebank Models
    Aaron Smith, Bernd Bohnet, Miryam de Lhoneux, Joakim Nivre, Yan Shao, Sara Stymne
    06 Sep 2018