GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost

Xinyi Shang, Peng Sun, Tao Lin
arXiv:2405.14736, 23 May 2024

Papers citing "GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost"

  • Generalizing Dataset Distillation via Deep Generative Prior (02 May 2023)
    George Cazenavette, Tongzhou Wang, Antonio Torralba, Alexei A. Efros, Jun-Yan Zhu
  • Dataset Distillation via Factorization (30 Oct 2022)
    Songhua Liu, Kai Wang, Xingyi Yang, Jingwen Ye, Xinchao Wang
  • Evaluating the Impact of Loss Function Variation in Deep Learning for Classification (28 Oct 2022)
    Simon Dräger, Jannik Dunkelau
  • Dataset Condensation via Efficient Synthetic-Data Parameterization (30 May 2022)
    Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song, Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song
  • Dataset Condensation with Differentiable Siamese Augmentation (16 Feb 2021)
    Bo-Lu Zhao, Hakan Bilen