2403.03483
A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation
6 March 2024
Lirong Wu, Haitao Lin, Zhangyang Gao, Guojiang Zhao, Stan Z. Li
Papers citing "A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation" (6 papers)
Sparse Decomposition of Graph Neural Networks
Yaochen Hu, Mai Zeng, Ge Zhang, P. Rumiantsev, Liheng Ma, Yingxue Zhang, Mark Coates
25 Oct 2024

Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models
Jun Rao, Xuebo Liu, Zepeng Lin, Liang Ding, Jing Li, Dacheng Tao, Min Zhang
19 Sep 2024

Teach Harder, Learn Poorer: Rethinking Hard Sample Distillation for GNN-to-MLP Knowledge Distillation
Lirong Wu, Yunfan Liu, Haitao Lin, Yufei Huang, Stan Z. Li
20 Jul 2024

Conditional Local Convolution for Spatio-temporal Meteorological Forecasting (AI4TS)
Haitao Lin, Zhangyang Gao, Yongjie Xu, Lirong Wu, Ling Li, Stan Z. Li
04 Jan 2021

Iterative Graph Self-Distillation (SSL)
Hanlin Zhang, Shuai Lin, Weiyang Liu, Pan Zhou, Jian Tang, Xiaodan Liang, Eric P. Xing
23 Oct 2020

Distilling Knowledge from Graph Convolutional Networks
Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang
23 Mar 2020