Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality
Xuxi Chen, Yu Yang, Zhangyang Wang, Baharan Mirzasoleiman
arXiv:2310.06982 · 10 October 2023 · DD
Papers citing "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" (8 papers)
Emphasizing Discriminative Features for Dataset Distillation in Complex Scenarios
Kai Wang, Zekai Li, Zhi-Qi Cheng, Samir Khaki, A. Sajedi, Ramakrishna Vedantam, Konstantinos N. Plataniotis, Alexander G. Hauptmann, Yang You
22 Oct 2024 · DD
Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-Training of Deep Networks
S. Joshi, Jiayi Ni, Baharan Mirzasoleiman
03 Oct 2024 · DD
ATOM: Attention Mixer for Efficient Dataset Distillation
Samir Khaki, A. Sajedi, Kai Wang, Lucy Z. Liu, Y. Lawryshyn, Konstantinos N. Plataniotis
02 May 2024
Generalizing Dataset Distillation via Deep Generative Prior
George Cazenavette, Tongzhou Wang, Antonio Torralba, Alexei A. Efros, Jun-Yan Zhu
02 May 2023 · DD
Dataset Distillation via Factorization
Songhua Liu, Kai Wang, Xingyi Yang, Jingwen Ye, Xinchao Wang
30 Oct 2022 · DD
Efficient Dataset Distillation Using Random Feature Approximation
Noel Loo, Ramin Hasani, Alexander Amini, Daniela Rus
21 Oct 2022 · DD
Dataset Condensation via Efficient Synthetic-Data Parameterization
Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song, Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song
30 May 2022 · DD
Dataset Condensation with Differentiable Siamese Augmentation
Bo-Lu Zhao, Hakan Bilen
16 Feb 2021 · DD