Data-free Knowledge Distillation for Fine-grained Visual Categorization

18 April 2024 · Renrong Shao, Wei Zhang, Jianhua Yin, Jun Wang
arXiv: 2404.12037

Papers citing "Data-free Knowledge Distillation for Fine-grained Visual Categorization"

4 papers shown

Sparse Model Inversion: Efficient Inversion of Vision Transformers for Data-Free Applications
International Conference on Machine Learning (ICML), 2025
Zixuan Hu, Yongxian Wei, Li Shen, Zhenyi Wang, Lei Li, Chun Yuan, Dacheng Tao
31 Oct 2025

Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments
Junming Liu, Yanting Gao, Siyuan Meng, Yifei Sun, Aoqi Wu, Yufei Jin, Yirong Chen, Botian Shi, Guosun Zeng
26 May 2025

Dataset Size Recovery from LoRA Weights
Mohammad Salama, Jonathan Kahana, Eliahu Horwitz, Yedid Hoshen
27 Jun 2024

Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 Oct 2019