ResearchTrend.AI

CokeBERT: Contextual Knowledge Selection and Embedding towards Enhanced Pre-Trained Language Models (arXiv:2009.13964)

29 September 2020
Yusheng Su, Xu Han, Zhengyan Zhang, Peng Li, Zhiyuan Liu, Yankai Lin, Jie Zhou, Maosong Sun

Papers citing "CokeBERT: Contextual Knowledge Selection and Embedding towards Enhanced Pre-Trained Language Models"

4 / 4 papers shown
1. Large Language Models Can Better Understand Knowledge Graphs Than We Thought
   Xinbang Dai, Yuncheng Hua, Tongtong Wu, Yang Sheng, Qiu Ji, Guilin Qi
   18 Feb 2024

2. Knowledge Perceived Multi-modal Pretraining in E-commerce
   Yushan Zhu, Huaixiao Tou, Wen Zhang, Ganqiang Ye, Hui Chen, Ningyu Zhang, Huajun Chen
   20 Aug 2021

3. K-BERT: Enabling Language Representation with Knowledge Graph
   Weijie Liu, Peng Zhou, Zhe Zhao, Zhiruo Wang, Qi Ju, Haotang Deng, Ping Wang
   17 Sep 2019

4. Knowledge Enhanced Contextual Word Representations
   Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith
   09 Sep 2019