Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering

arXiv:2307.11030 · 20 July 2023
Yijun Dong, Kevin Miller, Qiuyu Lei, Rachel A. Ward
Papers citing "Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering"

3 / 3 papers shown
Learning with invariances in random features and kernel models
Song Mei, Theodor Misiakiewicz, Andrea Montanari
25 Feb 2021

Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher
Guangda Ji, Zhanxing Zhu
20 Oct 2020
Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
25 Aug 2016