arXiv: 2109.01048
Pre-training Language Model Incorporating Domain-specific Heterogeneous Knowledge into A Unified Representation
2 September 2021
Hongyin Zhu, Hao Peng, Zhiheng Lyu, Lei Hou, Juan-Zi Li, Jinghui Xiao
Papers citing "Pre-training Language Model Incorporating Domain-specific Heterogeneous Knowledge into A Unified Representation"
5 / 5 papers shown

| Title | Authors | Topics | Date |
|---|---|---|---|
| MEG: Medical Knowledge-Augmented Large Language Models for Question Answering | Laura Cabello, Carmen Martin-Turrero, Uchenna Akujuobi, Anders Søgaard, Carlos Bobed | AI4MH | 06 Nov 2024 |
| HippoRAG: Neurobiologically Inspired Long-Term Memory for Large Language Models | Bernal Jiménez Gutiérrez, Yiheng Shu, Yu Gu, Michihiro Yasunaga, Yu-Chuan Su | RALM, CLL | 23 May 2024 |
| K-12BERT: BERT for K-12 education | Vasu Goel, Dhruv Sahnan, Venktesh V, Gaurav Sharma, Deep Dwivedi, Mukesh Mohania | AI4Ed | 24 May 2022 |
| K-BERT: Enabling Language Representation with Knowledge Graph | Weijie Liu, Peng Zhou, Zhe Zhao, Zhiruo Wang, Qi Ju, Haotang Deng, Ping Wang | | 17 Sep 2019 |
| Language Models as Knowledge Bases? | Fabio Petroni, Tim Rocktaschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel | KELM, AI4MH | 03 Sep 2019 |