ResearchTrend.AI
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation

19 June 2023
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
Tags: VLM, OffRL

Papers citing "Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation"

13 / 13 papers shown
Finger Pose Estimation for Under-screen Fingerprint Sensor
Xiongjun Guan, Zhiyu Pan, Jianjiang Feng, Jie Zhou
05 May 2025

From Large to Super-Tiny: End-to-End Optimization for Cost-Efficient LLMs
Jiliang Ni, Jiachen Pu, Zhongyi Yang, Kun Zhou, Hui Wang, Xiaoliang Xiao, Dakui Wang, Xin Li, Jingfeng Luo, Conggang Hu
18 Apr 2025

I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation
Ayoub Karine, Thibault Napoléon, M. Jridi
Tags: VLM
24 Feb 2025

ViTKD: Practical Guidelines for ViT feature knowledge distillation
Zhendong Yang, Zhe Li, Ailing Zeng, Zexian Li, Chun Yuan, Yu Li
06 Sep 2022

Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin
29 Apr 2021

Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Yixiao Ge, Xiao Zhang, Ching Lam Choi, Ka Chun Cheung, Peipei Zhao, Feng Zhu, Xiaogang Wang, Rui Zhao, Hongsheng Li
Tags: FedML, UQCV
27 Apr 2021

Mutual Contrastive Learning for Visual Representation Learning
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
Tags: VLM, SSL
26 Apr 2021

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han
12 Feb 2021

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park
05 Feb 2021

Distilling portable Generative Adversarial Networks for Image Translation
Hanting Chen, Yunhe Wang, Han Shu, Changyuan Wen, Chunjing Xu, Boxin Shi, Chao Xu, Chang Xu
07 Mar 2020

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
Tags: FedML
09 Apr 2018