
ProjB: An Improved Bilinear Biased ProjE model for Knowledge Graph Completion

Abstract

Knowledge Graph Embedding (KGE) methods have gained enormous attention from a wide range of AI communities, including Natural Language Processing (NLP), for text generation, classification, and context induction. Embedding a huge number of inter-relationships in a small number of dimensions requires proper modeling in both cognitive and computational aspects. Numerous objective functions addressing the cognitive and computational aspects of natural language have recently been developed, among them state-of-the-art methods based on linearity, bilinearity, manifold-preserving kernels, projection subspaces, and analogical inference. However, the major challenge of such models lies in their loss functions, which tie the dimension of the relation embeddings to that of the entity embeddings. This leads to inaccurate prediction of relations among entities when their counterparts are estimated incorrectly. In this work, the ProjE KGE model of Shi and Weninger, chosen for its low computational complexity and high potential for improvement, is extended to cover all translational and bilinear interactions while capturing entity nonlinearity. Experimental results on benchmark Knowledge Graphs (KGs) such as FB15K and WN18 show that the proposed approach outperforms state-of-the-art linear and bilinear models, as well as other recent powerful ones, on the entity prediction task. In addition, a parallel processing structure is proposed to improve the scalability of the model on large KGs. The effects of different adaptive clustering approaches and a newly proposed sampling approach are also analyzed and prove effective in improving the accuracy of knowledge graph completion.
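For context, ProjE scores candidate entities by combining the head and relation embeddings through diagonal weights and a bias, applying a nonlinearity, and projecting the result onto all candidate entity embeddings. The sketch below illustrates this kind of scoring function augmented with an additional bilinear interaction term, in the spirit of the improvement described above; the specific bilinear term `W_b`, the bias names, and the dimensions are illustrative assumptions, not the paper's exact ProjB formulation.

```python
import numpy as np

def score_candidates(h, r, E, D_e, D_r, W_b, b_c, b_p):
    """Score all candidate tail entities for a (head, relation) query.

    h   : (d,)   head entity embedding
    r   : (d,)   relation embedding
    E   : (n, d) candidate entity embedding matrix
    D_e : (d,)   diagonal combination weights for the entity (ProjE-style)
    D_r : (d,)   diagonal combination weights for the relation (ProjE-style)
    W_b : (d, d) bilinear interaction matrix (assumed, for illustration)
    b_c : (d,)   combination bias
    b_p : float  projection bias
    """
    # ProjE-style translational combination of head and relation
    combined = D_e * h + D_r * r + b_c
    # Assumed bilinear interaction between head and relation embeddings
    combined = combined + (h @ W_b) * r
    # Entity nonlinearity, then projection onto every candidate entity
    hidden = np.tanh(combined)
    return E @ hidden + b_p

# Toy usage with random embeddings
rng = np.random.default_rng(0)
d, n = 8, 5
scores = score_candidates(
    rng.normal(size=d), rng.normal(size=d), rng.normal(size=(n, d)),
    rng.normal(size=d), rng.normal(size=d), rng.normal(size=(d, d)),
    rng.normal(size=d), 0.0,
)
print(scores.shape)  # (5,)
```

In practice the returned scores would be passed through a softmax or ranking loss over the candidate set during training; the toy usage only checks shapes.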
