ResearchTrend.AI
Distilling Self-Knowledge From Contrastive Links to Classify Graph Nodes Without Passing Messages
arXiv: 2106.08541
16 June 2021
Yi Luo, Aiguo Chen, Ke Yan, Ling Tian

Papers citing "Distilling Self-Knowledge From Contrastive Links to Classify Graph Nodes Without Passing Messages"

10 papers shown:
Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting
Lirong Wu, Haitao Lin, Guojiang Zhao, Cheng Tan, Stan Z. Li
09 Sep 2024
A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation
Lirong Wu, Haitao Lin, Zhangyang Gao, Guojiang Zhao, Stan Z. Li
06 Mar 2024
Clarify Confused Nodes via Separated Learning
Jiajun Zhou, Sheng Gong, Chenxuan Xie, Shanqing Yu, Qi Xuan, Xiaoniu Yang
04 Jun 2023
Extracting Low-/High-Frequency Knowledge from Graph Neural Networks and Injecting it into MLPs: An Effective GNN-to-MLP Distillation Framework
Lirong Wu, Haitao Lin, Yufei Huang, Tianyu Fan, Stan Z. Li
18 May 2023
Graph-based Knowledge Distillation: A survey and experimental evaluation
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
27 Feb 2023
Automated Graph Self-supervised Learning via Multi-teacher Knowledge Distillation
Lirong Wu, Yufei Huang, Haitao Lin, Zicheng Liu, Tianyu Fan, Stan Z. Li
05 Oct 2022
Teaching Yourself: Graph Self-Distillation on Neighborhood for Node Classification
Lirong Wu, Jun Xia, Haitao Lin, Zhangyang Gao, Zicheng Liu, Guojiang Zhao, Stan Z. Li
05 Oct 2022
Graph Decipher: A transparent dual-attention graph neural network to understand the message-passing mechanism for the node classification
Yan Pang, Chao Liu
04 Jan 2022
Local Augmentation for Graph Neural Networks
Songtao Liu, Rex Ying, Hanze Dong, Lanqing Li, Tingyang Xu, Yu Rong, P. Zhao, Junzhou Huang, Dinghao Wu
08 Sep 2021
Improving neural networks by preventing co-adaptation of feature detectors
Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
03 Jul 2012