

Does Pre-trained Language Model Actually Infer Unseen Links in Knowledge Graph Completion?

15 November 2023
Yusuke Sakai, Hidetaka Kamigaito, Katsuhiko Hayashi, Taro Watanabe

Papers citing "Does Pre-trained Language Model Actually Infer Unseen Links in Knowledge Graph Completion?"

4 / 4 papers shown

Subsampling for Knowledge Graph Embedding Explained
Hidetaka Kamigaito, Katsuhiko Hayashi
13 Sep 2022

PyKEEN 1.0: A Python Library for Training and Evaluating Knowledge Graph Embeddings
Mehdi Ali, M. Berrendorf, Charles Tapley Hoyt, Laurent Vermue, Sahand Sharifzadeh, Volker Tresp, Jens Lehmann
28 Jul 2020

K-BERT: Enabling Language Representation with Knowledge Graph
Weijie Liu, Peng Zhou, Zhe Zhao, Zhiruo Wang, Qi Ju, Haotang Deng, Ping Wang
17 Sep 2019

Language Models as Knowledge Bases?
Fabio Petroni, Tim Rocktäschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
KELM, AI4MH
03 Sep 2019