arXiv:1909.03794

Composing Knowledge Graph Embeddings via Word Embeddings

9 September 2019
Lianbo Ma
Peng Sun
Zhiwei Lin
Hui Wang
Abstract

Learning knowledge graph embeddings from an existing knowledge graph is essential for knowledge graph completion. For a fact $(h, r, t)$, in which the head entity $h$ has a relation $r$ with the tail entity $t$, current approaches aim to learn low-dimensional representations $(\mathbf{h}, \mathbf{r}, \mathbf{t})$, each of which corresponds to an element of $(h, r, t)$. Because $(\mathbf{h}, \mathbf{r}, \mathbf{t})$ is learned only from the existing facts within a knowledge graph, these representations cannot be used to detect unknown facts whose entities or relations never occur in the knowledge graph. This paper proposes a new approach, called TransW, which goes beyond current work by composing knowledge graph embeddings from word embeddings. Since an entity or a relation is quite often named by one or more words, it is sensible to learn a mapping function from the word embedding space to the knowledge embedding space, which captures how entities are constructed from human words. More importantly, composing knowledge embeddings from word embeddings makes it possible to deal with emerging new facts, whether they involve new entities or new relations. Experimental results on three public datasets show that the proposed TransW is consistent and outperforms existing approaches.
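The abstract describes composing each knowledge-graph embedding from the word embeddings of an entity's or relation's name via a learned mapping between the two spaces, then scoring facts in the knowledge embedding space. The exact composition function is not given here, so the following is a minimal sketch, not the authors' implementation: it assumes a summed-word composition followed by a linear map, and a TransE-style translation score; the class and parameter names (TransWSketch, word_dim, kg_dim) are hypothetical.

import torch
import torch.nn as nn

class TransWSketch(nn.Module):
    def __init__(self, num_words, word_dim=100, kg_dim=50):
        super().__init__()
        # Shared word vectors; in practice these could be pretrained embeddings.
        self.word_emb = nn.Embedding(num_words, word_dim)
        # Learned maps from the word embedding space into the KG embedding space.
        self.entity_map = nn.Linear(word_dim, kg_dim, bias=False)
        self.relation_map = nn.Linear(word_dim, kg_dim, bias=False)

    def compose(self, word_ids, mapping):
        # Sum the word vectors of a name, then map the sum into KG space.
        return mapping(self.word_emb(word_ids).sum(dim=0))

    def score(self, head_words, rel_words, tail_words):
        # TransE-style plausibility: a smaller ||h + r - t|| means a more
        # plausible fact. Unseen entities can still be scored as long as
        # their constituent words have embeddings.
        h = self.compose(head_words, self.entity_map)
        r = self.compose(rel_words, self.relation_map)
        t = self.compose(tail_words, self.entity_map)
        return torch.norm(h + r - t, p=2)

model = TransWSketch(num_words=10_000)
head = torch.tensor([12, 47])   # word ids of a (possibly unseen) head entity name
rel = torch.tensor([301])       # word id(s) of the relation name
tail = torch.tensor([9])        # word ids of the tail entity name
print(model.score(head, rel, tail).item())

Under these assumptions, the composition function is the only bridge between the two spaces, which is what lets the model score facts over entities and relations that never appeared in the training graph.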
