DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning (arXiv:2009.05912)

13 September 2020
Yushan Zhu, Wen Zhang, Mingyang Chen, Hui Chen, Xu-Xin Cheng, Wei Zhang, Huajun Chen (Zhejiang University)

Papers citing "DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning"

4 / 4 papers shown
Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion
Cunhang Fan, Yujie Chen, Jun Xue, Yonghui Kong, Jianhua Tao, Zhao Lv
19 Jan 2024
Random Entity Quantization for Parameter-Efficient Compositional Knowledge Graph Representation
Jiaang Li, Quan Wang, Yi Liu, L. Zhang, Zhendong Mao
24 Oct 2023
From Wide to Deep: Dimension Lifting Network for Parameter-efficient Knowledge Graph Embedding
Borui Cai, Yong Xiang, Longxiang Gao, Di Wu, Heng Zhang, Jiongdao Jin, Tom H. Luan
22 Mar 2023
Graph-based Knowledge Distillation: A survey and experimental evaluation
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
27 Feb 2023