Teacher-Student Architecture for Knowledge Distillation: A Survey
8 August 2023
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
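The papers listed below all build on some variant of the teacher-student objective this survey organizes: a compact student network is trained to match a larger teacher's softened predictions alongside the ground-truth labels. As one concrete reference point, here is a minimal sketch of the classic softened-softmax distillation loss of Hinton et al. (2015); it is illustrative only and not taken from the survey or from any paper listed below, and the function name, temperature T, and weight alpha are assumed values.

```python
# Minimal sketch of the classic softened-softmax distillation loss
# (Hinton et al., 2015). PyTorch is assumed; names are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Combine a soft teacher-matching term with hard-label cross-entropy.

    T     -- temperature that softens both distributions (assumed value)
    alpha -- weight on the distillation term (assumed value)
    """
    # KL divergence between temperature-softened teacher and student
    # distributions; the T**2 factor keeps gradient magnitudes comparable
    # across temperatures, as in the original formulation.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random logits for a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```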

Papers citing "Teacher-Student Architecture for Knowledge Distillation: A Survey"

17 / 17 papers shown

Neuroplasticity in Artificial Intelligence -- An Overview and Inspirations on Drop In & Out Learning
Yupei Li, M. Milling, Björn Schuller · AI4CE · 27 Mar 2025

Adaptive Temperature Based on Logits Correlation in Knowledge Distillation
Kazuhiro Matsuyama, Usman Anjum, Satoko Matsuyama, Tetsuo Shoda, J. Zhan · 12 Mar 2025

SciceVPR: Stable Cross-Image Correlation Enhanced Model for Visual Place Recognition
Shanshan Wan, Yingmei Wei, Lai Kang, Tianrui Shen, Haixuan Wang, Yee-Hong Yang · 28 Feb 2025

MobileIQA: Exploiting Mobile-level Diverse Opinion Network For No-Reference Image Quality Assessment Using Knowledge Distillation
Zewen Chen, Sunhan Xu, Yun Zeng, Haochen Guo, Jian Guo, ..., Juan Wang, Bing Li, Weiming Hu, Dehua Liu, H. Li · 02 Sep 2024

Online pre-training with long-form videos
Itsuki Kato, Kodai Kamiya, Toru Tamaki · OnRL · 28 Aug 2024

POA: Pre-training Once for Models of All Sizes
Yingying Zhang, Xin Guo, Jiangwei Lao, Lei Yu, Lixiang Ru, Jian Wang, Guo Ye, Huimei He, Jingdong Chen, Ming Yang · 02 Aug 2024

Make a Strong Teacher with Label Assistance: A Novel Knowledge Distillation Approach for Semantic Segmentation
Shoumeng Qiu, Jie Chen, Xinrun Li, Ru Wan, Xiangyang Xue, Jian Pu · VLM · 18 Jul 2024

GeoWATCH for Detecting Heavy Construction in Heterogeneous Time Series of Satellite Images
Jon Crall, Connor Greenwell, David Joy, Matthew J. Leotta, Aashish Chaudhary, A. Hoogs · AI4TS · 08 Jul 2024

Survey on Knowledge Distillation for Large Language Models: Methods, Evaluation, and Application
Chuanpeng Yang, Wang Lu, Yao Zhu, Yidong Wang, Qian Chen, Chenlong Gao, Bingjie Yan, Yiqiang Chen · ALM, KELM · 02 Jul 2024

Weak-to-Strong 3D Object Detection with X-Ray Distillation
Alexander Gambashidze, Aleksandr Dadukin, Maksim Golyadkin, Maria Razzhivina, Ilya Makarov · 31 Mar 2024

Less or More From Teacher: Exploiting Trilateral Geometry For Knowledge Distillation
Chengming Hu, Haolun Wu, Xuan Li, Chen-li Ma, Xi Chen, Jun Yan, Boyu Wang, Xue Liu · 22 Dec 2023

Knowledge Transfer from Vision Foundation Models for Efficient Training of Small Task-specific Models
Raviteja Vemulapalli, Hadi Pouransari, Fartash Faghri, Sachin Mehta, Mehrdad Farajtabar, Mohammad Rastegari, Oncel Tuzel · 30 Nov 2023

Cross-Task Knowledge Distillation in Multi-Task Recommendation
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen · 20 Feb 2022

Localization Distillation for Dense Object Detection
Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, W. Zuo, Qibin Hou, Ming-Ming Cheng · ObjD · 24 Feb 2021

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han · 12 Feb 2021

Distilling Knowledge from Graph Convolutional Networks
Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang · 23 Mar 2020

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong · 12 Jun 2018