New Properties of the Data Distillation Method When Working With Tabular Data

Dmitry Medvedev, A. Dýakonov
International Joint Conference on the Analysis of Images, Social Networks and Texts (AISNT), 2020
19 October 2020

Papers citing "New Properties of the Data Distillation Method When Working With Tabular Data" (5 of 5 shown)
On Learning Representations for Tabular Data Distillation
Inwon Kang, Parikshit Ram, Yi Zhou, Horst Samulowitz, Oshani Seneviratne
23 Jan 2025

Practical Knowledge Distillation: Using DNNs to Beat DNNs
Chungman Lee, Pavlos Anastasios Apostolopulos, Igor L. Markov
23 Feb 2023

A Comprehensive Survey of Dataset Distillation
Shiye Lei, Dacheng Tao
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2023
13 Jan 2023

Learning to Generate Synthetic Training Data using Gradient Matching and Implicit Differentiation
Dmitry Medvedev, A. Dýakonov
International Joint Conference on the Analysis of Images, Social Networks and Texts (AISNT), 2022
16 Mar 2022

Deep Neural Networks and Tabular Data: A Survey
V. Borisov, Tobias Leemann, Kathrin Seßler, Johannes Haug, Martin Pawelczyk, Gjergji Kasneci
05 Oct 2021