Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?

29 June 2022
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-Man Cheung
arXiv: 2206.14532

Papers citing "Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?"

5 papers
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang
13 Jan 2025

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Li, Jian Yang
29 Nov 2022

AlphaNet: Improved Training of Supernets with Alpha-Divergence
Dilin Wang, Chengyue Gong, Meng Li, Qiang Liu, Vikas Chandra
16 Feb 2021

SEED: Self-supervised Distillation For Visual Representation (SSL)
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
12 Jan 2021

Bag of Tricks for Image Classification with Convolutional Neural Networks
Tong He, Zhi Zhang, Hang Zhang, Zhongyue Zhang, Junyuan Xie, Mu Li
04 Dec 2018