ResearchTrend.AI

Isotonic Data Augmentation for Knowledge Distillation
Wanyun Cui, Sen Yan
arXiv:2107.01412 · 3 July 2021

Papers citing "Isotonic Data Augmentation for Knowledge Distillation" (3 of 3 shown):

1. Cross-View Consistency Regularisation for Knowledge Distillation
   W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma (21 Dec 2024)
2. Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning
   Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim (23 Nov 2023)
3. Meta Knowledge Distillation
   Jihao Liu, Boxiao Liu, Hongsheng Li, Yu Liu (16 Feb 2022)