Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning (arXiv:2007.12355)

24 July 2020
Yiqin Yu
Xu Min
Shiwan Zhao
Jing Mei
Fei Wang
Dongsheng Li
Kenney Ng
Shaochun Li

Papers citing "Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning"

1 / 1 papers shown
TOHAN: A One-step Approach towards Few-shot Hypothesis Adaptation
Haoang Chi
Feng Liu
Wenjing Yang
L. Lan
Tongliang Liu
Bo Han
William Cheung
James T. Kwok
11 Jun 2021