arXiv:2009.05912
DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning
13 September 2020
Yushan Zhu, Wen Zhang, Mingyang Chen, Hui Chen, Xu-Xin Cheng, Wei Zhang, Huajun Chen (Zhejiang University)
Papers citing "DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning" (4 of 4 papers shown)
Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion
Cunhang Fan, Yujie Chen, Jun Xue, Yonghui Kong, Jianhua Tao, Zhao Lv
19 Jan 2024
Random Entity Quantization for Parameter-Efficient Compositional Knowledge Graph Representation
Jiaang Li, Quan Wang, Yi Liu, L. Zhang, Zhendong Mao
24 Oct 2023
From Wide to Deep: Dimension Lifting Network for Parameter-efficient Knowledge Graph Embedding
Borui Cai, Yong Xiang, Longxiang Gao, Di Wu, Heng Zhang, Jiongdao Jin, Tom H. Luan
22 Mar 2023
Graph-based Knowledge Distillation: A Survey and Experimental Evaluation
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
27 Feb 2023