Language Independent Named Entity Recognition via Orthogonal Transformation of Word Vectors

18 March 2025
Omar E. Rakha
Hazem M. Abbas
Abstract

Word embeddings have been a key building block for NLP, with models relying heavily on them across many different tasks. In this paper, a model based on a Bidirectional LSTM/CRF with word embeddings is proposed to perform named entity recognition for any language. This is done by training the model on a source language (English) and transforming word embeddings from the target language into the source language's embedding space using an orthogonal linear transformation matrix. Evaluation shows that, after being trained on an English dataset, the model was able to detect named entities in an Arabic dataset without any training or fine-tuning on Arabic data.
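
The abstract describes mapping target-language word vectors into the English embedding space with an orthogonal transformation before feeding them to the English-trained BiLSTM/CRF tagger. The Python sketch below shows one common way such an alignment is computed, via the closed-form solution to the orthogonal Procrustes problem; the seed-dictionary setup and all names are illustrative assumptions, not the authors' released code.

# Minimal sketch (assumption, not the paper's released implementation):
# align target-language word vectors to the source (English) embedding
# space with an orthogonal matrix, then project the full target vocabulary.
import numpy as np

def learn_orthogonal_map(X_tgt: np.ndarray, Y_src: np.ndarray) -> np.ndarray:
    """Orthogonal Procrustes: X_tgt and Y_src are (n, d) arrays of paired
    word vectors for n seed-dictionary entries (e.g., Arabic word and its
    English translation). Returns the (d, d) orthogonal matrix W that
    minimises ||X_tgt @ W - Y_src||_F."""
    U, _, Vt = np.linalg.svd(X_tgt.T @ Y_src)
    return U @ Vt

def map_embeddings(E_tgt: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Project the full target-language embedding matrix into the source space."""
    return E_tgt @ W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 300))   # target-language seed vectors (illustrative)
    Y = rng.normal(size=(5000, 300))   # corresponding English vectors (illustrative)
    W = learn_orthogonal_map(X, Y)
    assert np.allclose(W.T @ W, np.eye(300), atol=1e-6)  # W is orthogonal

The aligned embeddings produced by map_embeddings can then be passed directly to the BiLSTM/CRF tagger trained only on English data, which is what allows recognition in the target language without further training or fine-tuning.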

@article{rakha2025_2503.14755,
  title={Language Independent Named Entity Recognition via Orthogonal Transformation of Word Vectors},
  author={Omar E. Rakha and Hazem M. Abbas},
  journal={arXiv preprint arXiv:2503.14755},
  year={2025}
}