Contrastive Representation Distillation
arXiv:1910.10699 · 23 October 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
Papers citing "Contrastive Representation Distillation" (11 of 611 shown):

| Title | Authors | Tags | Date |
| --- | --- | --- | --- |
| Distilling Spikes: Knowledge Distillation in Spiking Neural Networks | R. K. Kushawaha, S. Kumar, Biplab Banerjee, R. Velmurugan | | 01 May 2020 |
| Teacher-Class Network: A Neural Network Compression Mechanism | Shaiq Munir Malik, Muhammad Umair Haider, Fnu Mohbat, Musab Rasheed, M. Taj | | 07 Apr 2020 |
| SuperMix: Supervising the Mixing Data Augmentation | Ali Dabouei, Sobhan Soleymani, Fariborz Taherkhani, Nasser M. Nasrabadi | | 10 Mar 2020 |
| Knowledge distillation via adaptive instance normalization | Jing Yang, Brais Martínez, Adrian Bulat, Georgios Tzimiropoulos | | 09 Mar 2020 |
| Knapsack Pruning with Inner Distillation | Y. Aflalo, Asaf Noy, Ming Lin, Itamar Friedman, Lihi Zelnik-Manor | 3DPC | 19 Feb 2020 |
| Subclass Distillation | Rafael Müller, Simon Kornblith, Geoffrey E. Hinton | | 10 Feb 2020 |
| The State of Knowledge Distillation for Classification | Fabian Ruffy, K. Chahal | | 20 Dec 2019 |
| QUEST: Quantized embedding space for transferring knowledge | Himalaya Jain, Spyros Gidaris, N. Komodakis, P. Pérez, Matthieu Cord | | 03 Dec 2019 |
| Contrastive Multiview Coding | Yonglong Tian, Dilip Krishnan, Phillip Isola | SSL | 13 Jun 2019 |
| ExpandNets: Linear Over-parameterization to Train Compact Convolutional Networks | Shuxuan Guo, J. Álvarez, Mathieu Salzmann | | 26 Nov 2018 |
| Knowledge Distillation by On-the-Fly Native Ensemble | Xu Lan, Xiatian Zhu, S. Gong | | 12 Jun 2018 |