Multi-task Self-distillation for Graph-based Semi-Supervised Learning
arXiv:2112.01174, 2 December 2021
Yating Ren
Junzhong Ji
Lingfeng Niu
Minglong Lei

Papers citing "Multi-task Self-distillation for Graph-based Semi-Supervised Learning"

2 / 2 papers shown
A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation
Lirong Wu
Haitao Lin
Zhangyang Gao
Guojiang Zhao
Stan Z. Li
06 Mar 2024
Graph-based Knowledge Distillation: A survey and experimental evaluation
Jing Liu
Tongya Zheng
Guanzheng Zhang
Qinfen Hao
27 Feb 2023