ResearchTrend.AI

Curriculum Temperature for Knowledge Distillation

29 November 2022
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
Papers citing "Curriculum Temperature for Knowledge Distillation"

5 / 55 papers shown
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework
Junzhuo Li, Xinwei Wu, Weilong Dong, Shuangzhi Wu, Chao Bian, Deyi Xiong
16 Dec 2022
Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-man Cheung
29 Jun 2022
Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021
Curriculum DeepSDF
Yueqi Duan, Haidong Zhu, He-Nan Wang, L. Yi, Ram Nevatia, Leonidas J. Guibas
19 Mar 2020
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017