Distill Gold from Massive Ores: Efficient Dataset Distillation via Critical Samples Selection
arXiv:2305.18381, 28 May 2023
Yue Xu, Yong-Lu Li, Kaitong Cui, Ziyu Wang, Cewu Lu, Yu-Wing Tai, Chi-Keung Tang

Papers citing "Distill Gold from Massive Ores: Efficient Dataset Distillation via Critical Samples Selection"

13 / 13 papers shown
Emphasizing Discriminative Features for Dataset Distillation in Complex Scenarios
Kai Wang, Zekai Li, Zhi-Qi Cheng, Samir Khaki, A. Sajedi, Ramakrishna Vedantam, Konstantinos N. Plataniotis, Alexander G. Hauptmann, Yang You
22 Oct 2024

Dataset Distillation from First Principles: Integrating Core Information Extraction and Purposeful Learning
Vyacheslav Kungurtsev, Yuanfang Peng, Jianyang Gu, Saeed Vahidian, Anthony Quinn, Fadwa Idlahcen, Yiran Chen
02 Sep 2024

Prioritize Alignment in Dataset Distillation
Zekai Li, Ziyao Guo, Wangbo Zhao, Tianle Zhang, Zhi-Qi Cheng, ..., Kaipeng Zhang, A. Sajedi, Konstantinos N. Plataniotis, Kai Wang, Yang You
06 Aug 2024

Backdoor Graph Condensation
Jiahao Wu, Ning Lu, Zeiyu Dai, Kun Wang, Wenqi Fan, Shengcai Liu, Qing Li, Ke Tang
03 Jul 2024

Low-Rank Similarity Mining for Multimodal Dataset Distillation
Yue Xu, Zhilin Lin, Yusong Qiu, Cewu Lu, Yong-Lu Li
06 Jun 2024

DD-RobustBench: An Adversarial Robustness Benchmark for Dataset Distillation
Yifan Wu, Jiawei Du, Ping Liu, Yuewei Lin, Wenqing Cheng, Wei-ping Xu
20 Mar 2024

Improve Cross-Architecture Generalization on Dataset Distillation
Binglin Zhou, Linhao Zhong, Wentao Chen
20 Feb 2024

Dataset Distillation via Factorization
Songhua Liu, Kai Wang, Xingyi Yang, Jingwen Ye, Xinchao Wang
30 Oct 2022

Efficient Dataset Distillation Using Random Feature Approximation
Noel Loo, Ramin Hasani, Alexander Amini, Daniela Rus
21 Oct 2022

Dataset Condensation via Efficient Synthetic-Data Parameterization
Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song, Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song
30 May 2022

Dataset Pruning: Reducing Training Data by Examining Generalization Influence
Shuo Yang, Zeke Xie, Hanyu Peng, Minjing Xu, Mingming Sun, P. Li
19 May 2022

GRAD-MATCH: Gradient Matching based Data Subset Selection for Efficient Deep Model Training
Krishnateja Killamsetty, D. Sivasubramanian, Ganesh Ramakrishnan, A. De, Rishabh K. Iyer
27 Feb 2021

Dataset Condensation with Differentiable Siamese Augmentation
Bo-Lu Zhao, Hakan Bilen
16 Feb 2021