The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image

1 December 2021
Authors: Yuki M. Asano, Aaqib Saeed

Papers citing "The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image"

6 of 6 papers shown:

1. Federated Learning with a Single Shared Image
   Sunny Soni, Aaqib Saeed, Yuki M. Asano
   Tags: FedML, DD
   18 Jun 2024

2. Communication-Efficient Federated Learning through Adaptive Weight Clustering and Server-Side Distillation
   Vasileios Tsouvalas, Aaqib Saeed, T. Ozcelebi, N. Meratnia
   Tags: FedML
   25 Jan 2024

3. SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data
   Hatef Otroshi, Anjith George, Sébastien Marcel
   28 Aug 2023

4. Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
   Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang
   22 May 2023

5. Dataset Condensation with Differentiable Siamese Augmentation
   Bo-Lu Zhao, Hakan Bilen
   Tags: DD
   16 Feb 2021

6. Pre-training without Natural Images
   Hirokatsu Kataoka, Kazushige Okayasu, Asato Matsumoto, Eisuke Yamagata, Ryosuke Yamada, Nakamasa Inoue, Akio Nakamura, Y. Satoh
   21 Jan 2021