Contrastive Distillation Is a Sample-Efficient Self-Supervised Loss Policy for Transfer Learning
arXiv:2212.11353 · 21 December 2022
Christopher T. Lengerich, Gabriel Synnaeve, Amy Zhang, Hugh Leather, Kurt Shuster, François Charton, Charysse Redwood
Tags: SSL, OffRL
Links: ArXiv (abs) · PDF · HTML

Papers citing "Contrastive Distillation Is a Sample-Efficient Self-Supervised Loss Policy for Transfer Learning" (3 of 3 papers shown)

Augmented Language Models: a Survey
Grégoire Mialon, Roberto Dessì, Maria Lomeli, Christoforos Nalmpantis, Ramakanth Pasunuru, ..., Jane Dwivedi-Yu, Asli Celikyilmaz, Edouard Grave, Yann LeCun, Thomas Scialom
15 Feb 2023 · Tags: LRM, KELM

Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2020
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 Oct 2019

Billion-scale similarity search with GPUs
IEEE Transactions on Big Data (TBD), 2017
Jeff Johnson, Matthijs Douze, Hervé Jégou
28 Feb 2017