Revisiting Knowledge Distillation under Distribution Shift
Songming Zhang, Ziyu Lyu, Xiaofeng Chen
25 December 2023

Papers citing "Revisiting Knowledge Distillation under Distribution Shift" (3 papers shown)

  • Knowledge Distillation with Adapted Weight — Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng (06 Jan 2025)
  • A Fine-Grained Analysis on Distribution Shift — Olivia Wiles, Sven Gowal, Florian Stimberg, Sylvestre-Alvise Rebuffi, Ira Ktena, Krishnamurthy Dvijotham, A. Cemgil [OOD] (21 Oct 2021)
  • Distilling Knowledge via Knowledge Review — Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia (19 Apr 2021)