ResearchTrend.AI

Practical Knowledge Distillation: Using DNNs to Beat DNNs
23 February 2023
Chungman Lee
Pavlos Anastasios Apostolopulos
Igor L. Markov
FedML
arXiv: 2302.12360 (PDF and HTML available)

Papers citing "Practical Knowledge Distillation: Using DNNs to Beat DNNs"

4 / 4 papers shown
Tree-Regularized Tabular Embeddings
Xuan Li, Yunhe Wang, Boqian Li
LMTD · 01 Mar 2024

Looper: An end-to-end ML platform for product decisions
I. Markov, Hanson Wang, Nitya Kasturi, Shaun Singh, Szeto Wai Yuen, ..., Michael Belkin, Sal Uryasev, Sam Howie, E. Bakshy, Norm Zhou
OffRL · 14 Oct 2021

A Survey on Ensemble Learning under the Era of Deep Learning
Yongquan Yang, Haijun Lv, Ning Chen
OOD · 21 Jan 2021

New Properties of the Data Distillation Method When Working With Tabular Data
Dmitry Medvedev, A. D'yakonov
DD · 19 Oct 2020