arXiv: 2009.13964
CokeBERT: Contextual Knowledge Selection and Embedding towards Enhanced Pre-Trained Language Models
29 September 2020
Yusheng Su, Xu Han, Zhengyan Zhang, Peng Li, Zhiyuan Liu, Yankai Lin, Jie Zhou, Maosong Sun
Papers citing "CokeBERT: Contextual Knowledge Selection and Embedding towards Enhanced Pre-Trained Language Models" (4 papers shown)
Large Language Models Can Better Understand Knowledge Graphs Than We Thought — Xinbang Dai, Yuncheng Hua, Tongtong Wu, Yang Sheng, Qiu Ji, Guilin Qi (18 Feb 2024)
Knowledge Perceived Multi-modal Pretraining in E-commerce — Yushan Zhu, Huaixiao Tou, Wen Zhang, Ganqiang Ye, Hui Chen, Ningyu Zhang, Huajun Chen (20 Aug 2021)
K-BERT: Enabling Language Representation with Knowledge Graph — Weijie Liu, Peng Zhou, Zhe Zhao, Zhiruo Wang, Qi Ju, Haotang Deng, Ping Wang (17 Sep 2019)
Knowledge Enhanced Contextual Word Representations — Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith (09 Sep 2019)