Distilling Model Knowledge

8 October 2015
George Papamakarios
BDL

Papers citing "Distilling Model Knowledge"

10 of 10 papers shown
Knowledge Distillation Using Frontier Open-source LLMs: Generalizability and the Role of Synthetic Data
Anup Shirgaonkar, Nikhil Pandey, Nazmiye Ceren Abay, Tolga Aktas, Vijay Aski
ALM, SyDa · 34 · 0 · 0 · 24 Oct 2024

Functional Ensemble Distillation
Coby Penso, Idan Achituve, Ethan Fetaya
FedML · 33 · 2 · 0 · 05 Jun 2022

OBoW: Online Bag-of-Visual-Words Generation for Self-Supervised Learning
Spyros Gidaris, Andrei Bursuc, Gilles Puy, N. Komodakis, Matthieu Cord, P. Pérez
SSL · 30 · 70 · 0 · 21 Dec 2020

Bypass Enhancement RGB Stream Model for Pedestrian Action Recognition of Autonomous Vehicles
Dong Cao, Lisha Xu
17 · 2 · 0 · 15 Aug 2019

Ensemble Distribution Distillation
A. Malinin, Bruno Mlodozeniec, Mark Gales
UQCV · 27 · 230 · 0 · 30 Apr 2019

Non-Iterative Knowledge Fusion in Deep Convolutional Neural Networks
M. Leontev, V. Islenteva, S. Sukhov
MoMe, FedML · 19 · 24 · 0 · 25 Sep 2018

Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification
Chong-Jun Wang, Xipeng Lan, Yang Zhang
CVBM · 15 · 26 · 0 · 09 Sep 2017

Knowledge distillation using unlabeled mismatched images
Mandar M. Kulkarni, Kalpesh Patil, Shirish S. Karande
48 · 16 · 0 · 21 Mar 2017

A scalable convolutional neural network for task-specified scenarios via knowledge distillation
Mengnan Shi, F. Qin, QiXiang Ye, Zhenjun Han, Jianbin Jiao
16 · 5 · 0 · 19 Sep 2016

MCMC using Hamiltonian dynamics
Radford M. Neal
185 · 3,267 · 0 · 09 Jun 2012