
A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation

6 March 2024
Lirong Wu, Haitao Lin, Zhangyang Gao, Guojiang Zhao, Stan Z. Li
arXiv: 2403.03483 (abs) · PDF · HTML · GitHub (8★)

Papers citing "A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation"

Sparse Decomposition of Graph Neural Networks
Yaochen Hu, Mai Zeng, Ge Zhang, Pavel Rumiantsev, Liheng Ma, Yingxue Zhang, Mark Coates
25 Oct 2024