Knowledge Distillation Beyond Model Compression
F. Sarfraz, Elahe Arani, Bahram Zonooz
arXiv:2007.01922 · 3 July 2020
Papers citing "Knowledge Distillation Beyond Model Compression" (7 of 7 shown)
| Title | Authors | Tags | Metrics | Date |
|-------|---------|------|---------|------|
| Rotation Invariant Quantization for Model Compression | Dor-Joseph Kampeas, Yury Nahshan, Hanoch Kremer, Gil Lederman, Shira Zaloshinski, Zheng Li, E. Haleva | MQ | 23 · 1 · 0 | 03 Mar 2023 |
| Sparse Coding in a Dual Memory System for Lifelong Learning | F. Sarfraz, Elahe Arani, Bahram Zonooz | CLL | 21 · 20 · 0 | 28 Dec 2022 |
| LILA-BOTI : Leveraging Isolated Letter Accumulations By Ordering Teacher Insights for Bangla Handwriting Recognition | Md. Ismail Hossain, Mohammed Rakib, Sabbir Mollah, Fuad Rahman, Nabeel Mohammed | | 29 · 6 · 0 | 23 May 2022 |
| Self-Distillation from the Last Mini-Batch for Consistency Regularization | Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo | | 17 · 62 · 0 | 30 Mar 2022 |
| Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System | Elahe Arani, F. Sarfraz, Bahram Zonooz | CLL | 93 · 123 · 0 | 29 Jan 2022 |
| Distill on the Go: Online knowledge distillation in self-supervised learning | Prashant Shivaram Bhat, Elahe Arani, Bahram Zonooz | SSL | 22 · 28 · 0 | 20 Apr 2021 |
| Knowledge Distillation by On-the-Fly Native Ensemble | Xu Lan, Xiatian Zhu, S. Gong | | 212 · 474 · 0 | 12 Jun 2018 |