RAKA: Co-training of Relationships and Attributes for Cross-lingual Knowledge Alignment

29 October 2019
Bo Chen
Jing Zhang
Xiaobin Tang
Hong Chen
Cuiping Li
arXiv:1910.13105
Abstract

Cross-lingual knowledge alignment suffers from attribute heterogeneity when leveraging attributes, and from conflicts when combining the results inferred from attributes and relationships. This paper proposes an interaction-based attribute model that captures attribute-level interactions to estimate entity similarities, eliminating the negative impact of dissimilar attributes. A matrix-based strategy is adopted in the model to accelerate the similarity estimation. We further propose a co-training framework, together with three merge strategies, to combine the alignments inferred by the attribute model and the relationship model. The whole framework can effectively and efficiently infer the aligned entities, relationships, attributes, and values simultaneously. Experimental results on several cross-lingual knowledge datasets show that our model significantly outperforms state-of-the-art comparison methods, improving Hit Ratio@1 by 2.35-51.57%.
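To make the idea of matrix-based attribute-level similarity concrete, here is a minimal illustrative sketch. It is not the RAKA interaction model itself (the abstract does not specify its architecture); it only shows, under assumed attribute embeddings, how a single matrix multiplication yields all pairwise attribute interactions between two entities and how pooling the best matches keeps dissimilar attributes from dominating the score.

# Illustrative sketch only: a generic matrix-based attribute similarity,
# not the actual RAKA interaction model described in the paper.
import numpy as np

def entity_similarity(attrs_a: np.ndarray, attrs_b: np.ndarray) -> float:
    """Estimate entity similarity from two sets of attribute embeddings.

    attrs_a: (m, d) matrix, one row per attribute embedding of entity A.
    attrs_b: (n, d) matrix, one row per attribute embedding of entity B.
    """
    # L2-normalise rows so dot products become cosine similarities.
    a = attrs_a / (np.linalg.norm(attrs_a, axis=1, keepdims=True) + 1e-9)
    b = attrs_b / (np.linalg.norm(attrs_b, axis=1, keepdims=True) + 1e-9)

    # One matrix multiplication computes all m*n attribute-level interactions
    # at once; this batching is what a matrix-based strategy exploits for speed.
    interactions = a @ b.T          # shape (m, n)

    # Keep only each attribute's best match, so dissimilar attribute pairs
    # contribute little to the entity-level score.
    best_per_row = interactions.max(axis=1)
    return float(best_per_row.mean())

# Example with random embeddings standing in for learned attribute vectors.
rng = np.random.default_rng(0)
print(entity_similarity(rng.normal(size=(5, 32)), rng.normal(size=(7, 32))))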
