Distilling Model Knowledge
George Papamakarios
8 October 2015 · arXiv:1510.02437 · BDL
Papers citing "Distilling Model Knowledge" (10 of 10 papers shown)
Title | Authors | Tags | Likes | Citations | Comments | Date
----- | ------- | ---- | ----- | --------- | -------- | ----
Knowledge Distillation Using Frontier Open-source LLMs: Generalizability and the Role of Synthetic Data | Anup Shirgaonkar, Nikhil Pandey, Nazmiye Ceren Abay, Tolga Aktas, Vijay Aski | ALM, SyDa | 34 | 0 | 0 | 24 Oct 2024
Functional Ensemble Distillation | Coby Penso, Idan Achituve, Ethan Fetaya | FedML | 33 | 2 | 0 | 05 Jun 2022
OBoW: Online Bag-of-Visual-Words Generation for Self-Supervised Learning | Spyros Gidaris, Andrei Bursuc, Gilles Puy, N. Komodakis, Matthieu Cord, P. Pérez | SSL | 30 | 70 | 0 | 21 Dec 2020
Bypass Enhancement RGB Stream Model for Pedestrian Action Recognition of Autonomous Vehicles | Dong Cao, Lisha Xu | | 17 | 2 | 0 | 15 Aug 2019
Ensemble Distribution Distillation | A. Malinin, Bruno Mlodozeniec, Mark Gales | UQCV | 27 | 230 | 0 | 30 Apr 2019
Non-Iterative Knowledge Fusion in Deep Convolutional Neural Networks | M. Leontev, V. Islenteva, S. Sukhov | MoMe, FedML | 19 | 24 | 0 | 25 Sep 2018
Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification | Chong-Jun Wang, Xipeng Lan, Yang Zhang | CVBM | 15 | 26 | 0 | 09 Sep 2017
Knowledge distillation using unlabeled mismatched images | Mandar M. Kulkarni, Kalpesh Patil, Shirish S. Karande | | 48 | 16 | 0 | 21 Mar 2017
A scalable convolutional neural network for task-specified scenarios via knowledge distillation | Mengnan Shi, F. Qin, QiXiang Ye, Zhenjun Han, Jianbin Jiao | | 16 | 5 | 0 | 19 Sep 2016
MCMC using Hamiltonian dynamics | Radford M. Neal | | 185 | 3,267 | 0 | 09 Jun 2012