Implicit Word Reordering with Knowledge Distillation for Cross-Lingual Dependency Parsing

24 February 2025
Zhuoran Li
Chunming Hu
Junfan Chen
Zhijun Chen
Richong Zhang
Abstract

Word order differences between source and target languages are a major obstacle to cross-lingual transfer, especially for dependency parsing. Existing work mostly relies on order-agnostic models or explicit word reordering to mitigate this problem. However, such methods either fail to leverage the grammatical information naturally encoded in word order or are computationally expensive, since the permutation space grows exponentially with sentence length. Moreover, a reordered source sentence with an unnatural word order may act as noise that harms model learning. To this end, we propose an Implicit Word Reordering framework with Knowledge Distillation (IWR-KD). The framework is inspired by the observation that deep networks are good at learning feature linearizations that correspond to meaningful data transformations, e.g., word reordering. To realize this idea, we introduce a knowledge distillation framework composed of a word-reordering teacher model and a dependency-parsing student model. We evaluate our method on Universal Dependencies treebanks across 31 languages and show that it outperforms a series of competitors; experimental analysis further illustrates how the method trains a robust parser.
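
The teacher-student setup described in the abstract follows the general knowledge-distillation recipe: the student parser is trained both on gold trees and on soft targets produced by a teacher. The sketch below illustrates that generic recipe in PyTorch for dependency-head prediction; the biaffine scorer, hidden size, loss weighting (alpha), and temperature are illustrative assumptions and are not taken from the paper's actual IWR-KD architecture.

```python
# Minimal, illustrative sketch of teacher-student knowledge distillation for
# dependency-head prediction. All module names, sizes, and loss weights are
# hypothetical; they do not reproduce the paper's IWR-KD implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiaffineHeadScorer(nn.Module):
    """Scores every (dependent, head) token pair with a biaffine layer."""
    def __init__(self, hidden_dim: int = 256):
        super().__init__()
        self.dep_mlp = nn.Linear(hidden_dim, hidden_dim)
        self.head_mlp = nn.Linear(hidden_dim, hidden_dim)
        self.biaffine = nn.Parameter(torch.randn(hidden_dim, hidden_dim))

    def forward(self, encodings: torch.Tensor) -> torch.Tensor:
        # encodings: (batch, seq_len, hidden_dim) contextual word representations
        dep = torch.tanh(self.dep_mlp(encodings))    # dependent-side views
        head = torch.tanh(self.head_mlp(encodings))  # candidate-head views
        # scores[b, i, j] = score of token j being the head of token i
        return dep @ self.biaffine @ head.transpose(1, 2)

def distillation_loss(student_scores, teacher_scores, gold_heads,
                      alpha: float = 0.5, temperature: float = 2.0):
    """Combine a gold-supervised parsing loss with a KL term toward the teacher.

    student_scores, teacher_scores: (batch, seq_len, seq_len) head scores.
    gold_heads: (batch, seq_len) gold head indices.
    """
    # Standard cross-entropy against gold head indices.
    ce = F.cross_entropy(student_scores.flatten(0, 1), gold_heads.flatten())
    # Soft targets from the teacher's head distribution (e.g., a word-reordering
    # teacher in the IWR-KD setting).
    kd = F.kl_div(
        F.log_softmax(student_scores / temperature, dim=-1),
        F.softmax(teacher_scores / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1 - alpha) * ce + alpha * kd
```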

@article{li2025_2502.17308,
  title={Implicit Word Reordering with Knowledge Distillation for Cross-Lingual Dependency Parsing},
  author={Zhuoran Li and Chunming Hu and Junfan Chen and Zhijun Chen and Richong Zhang},
  journal={arXiv preprint arXiv:2502.17308},
  year={2025}
}