Quantifying the Knowledge in GNNs for Reliable Distillation into MLPs
arXiv: 2306.05628
9 June 2023
Lirong Wu, Haitao Lin, Yufei Huang, Stan Z. Li
Papers citing "Quantifying the Knowledge in GNNs for Reliable Distillation into MLPs" (7 papers shown):

Sparse Decomposition of Graph Neural Networks (25 Oct 2024)
Yaochen Hu, Mai Zeng, Ge Zhang, P. Rumiantsev, Liheng Ma, Yingxue Zhang, Mark Coates

Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting (09 Sep 2024)
Lirong Wu, Haitao Lin, Guojiang Zhao, Cheng Tan, Stan Z. Li

A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation (06 Mar 2024)
Lirong Wu, Haitao Lin, Zhangyang Gao, Guojiang Zhao, Stan Z. Li

MAPE-PPI: Towards Effective and Efficient Protein-Protein Interaction Prediction via Microenvironment-Aware Protein Embedding (22 Feb 2024)
Lirong Wu, Yijun Tian, Yufei Huang, Siyuan Li, Haitao Lin, Nitesh V. Chawla, Stan Z. Li

Teaching Yourself: Graph Self-Distillation on Neighborhood for Node Classification (05 Oct 2022)
Lirong Wu, Jun-Xiong Xia, Haitao Lin, Zhangyang Gao, Zicheng Liu, Guojiang Zhao, Stan Z. Li

Iterative Graph Self-Distillation (23 Oct 2020) [SSL]
Hanlin Zhang, Shuai Lin, Weiyang Liu, Pan Zhou, Jian Tang, Xiaodan Liang, Eric P. Xing

Distilling Knowledge from Graph Convolutional Networks (23 Mar 2020)
Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang