On the Size and Approximation Error of Distilled Sets
arXiv:2305.14113 · 23 May 2023

Alaa Maalouf, M. Tukan, Noel Loo, Ramin Hasani, Mathias Lechner, Daniela Rus

Topic: DD

Papers citing "On the Size and Approximation Error of Distilled Sets" (5 of 5 shown)

1. Understanding Dataset Distillation via Spectral Filtering
   Deyu Bo, Songhua Liu, Xinchao Wang · DD · 03 Mar 2025

2. Emphasizing Discriminative Features for Dataset Distillation in Complex Scenarios
   Kai Wang, Zekai Li, Zhi-Qi Cheng, Samir Khaki, A. Sajedi, Ramakrishna Vedantam, Konstantinos N. Plataniotis, Alexander G. Hauptmann, Yang You · DD · 22 Oct 2024

3. What is Dataset Distillation Learning?
   William Yang, Ye Zhu, Zhiwei Deng, Olga Russakovsky · DD · 06 Jun 2024

4. Efficient Dataset Distillation Using Random Feature Approximation
   Noel Loo, Ramin Hasani, Alexander Amini, Daniela Rus · DD · 21 Oct 2022

5. Dataset Condensation with Differentiable Siamese Augmentation
   Bo-Lu Zhao, Hakan Bilen · DD · 16 Feb 2021