Semi-Online Knowledge Distillation

23 November 2021
Zhiqiang Liu, Yanxia Liu, Chengkai Huang
ArXiv · PDF · HTML

Papers citing "Semi-Online Knowledge Distillation" (5 of 5 papers shown)

AGTGAN: Unpaired Image Translation for Photographic Ancient Character Generation
Hongxiang Huang, Daihui Yang, Gang Dai, Zhen Han, Yuyi Wang, K. Lam, Fan Yang, Shuangping Huang, Yong-ge Liu, Mengchao He
GAN · 13 Mar 2023

Parameter-Efficient and Student-Friendly Knowledge Distillation
Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao
28 May 2022

Improved Knowledge Distillation via Adversarial Collaboration
Zhiqiang Liu, Chengkai Huang, Yanxia Liu
29 Nov 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
3DH · 17 Apr 2017