
KG-BiLM: Knowledge Graph Embedding via Bidirectional Language Models

Main: 9 pages · 5 figures · 8 tables · Bibliography: 5 pages · Appendix: 17 pages
Abstract

Recent advances in knowledge representation learning (KRL) highlight the urgent need to unify symbolic knowledge graphs (KGs) with language models (LMs) for richer semantic understanding. However, existing approaches typically prioritize either graph structure or textual semantics, leaving a gap: no unified framework simultaneously captures global KG connectivity, nuanced linguistic context, and discriminative reasoning semantics. To bridge this gap, we introduce KG-BiLM, a bidirectional LM framework that fuses structural cues from KGs with the semantic expressiveness of generative transformers. KG-BiLM incorporates three key components: (i) Bidirectional Knowledge Attention, which removes the causal mask to enable full interaction among all tokens and entities; (ii) Knowledge-Masked Prediction, which encourages the model to leverage both local semantic contexts and global graph connectivity; and (iii) Contrastive Graph Semantic Aggregation, which preserves KG structure via contrastive alignment of sampled sub-graph representations. Extensive experiments on standard benchmarks demonstrate that KG-BiLM outperforms strong baselines in link prediction, especially on large-scale graphs with complex multi-hop relations, validating its effectiveness in unifying structural information with textual semantics.
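To make components (i) and (iii) concrete, the following sketch shows, in PyTorch-style code, what non-causal (bidirectional) attention and an InfoNCE-style contrastive alignment of sampled sub-graph representations could look like. This is a minimal illustration under stated assumptions: all function names, tensor shapes, and the temperature value are hypothetical and are not taken from the paper's implementation.

import torch
import torch.nn.functional as F

def bidirectional_knowledge_attention(q, k, v):
    """Full (non-causal) self-attention: every token/entity attends to all others.
    q, k, v: (batch, seq_len, dim). No causal mask is applied (component i)."""
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    weights = F.softmax(scores, dim=-1)  # no causal mask, full token-entity interaction
    return weights @ v

def contrastive_subgraph_loss(anchor, positive, negatives, tau=0.07):
    """InfoNCE-style alignment of sub-graph representations (component iii).
    anchor, positive: (batch, dim); negatives: (batch, n_neg, dim); tau is a
    hypothetical temperature hyperparameter."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)
    pos_sim = (anchor * positive).sum(-1, keepdim=True)       # (batch, 1)
    neg_sim = torch.einsum('bd,bnd->bn', anchor, negatives)   # (batch, n_neg)
    logits = torch.cat([pos_sim, neg_sim], dim=-1) / tau
    labels = torch.zeros(anchor.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)                    # positive sits at index 0

Knowledge-Masked Prediction (component ii) would presumably sit on top of such a bidirectional encoder as a masked token/entity prediction head, analogous to BERT-style masked language modeling, so that predictions draw on both local text context and global graph connectivity.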

@article{chen2025_2506.03576,
  title={KG-BiLM: Knowledge Graph Embedding via Bidirectional Language Models},
  author={Zirui Chen and Xin Wang and Zhao Li and Wenbin Guo and Dongxiao He},
  journal={arXiv preprint arXiv:2506.03576},
  year={2025}
}