Learning Downstream Task by Selectively Capturing Complementary Knowledge from Multiple Self-supervisedly Learning Pretexts

11 April 2022
Jiayu Yao, Qingyuan Wu, Quan Feng, Songcan Chen
SSL
arXiv: 2204.05248

Papers citing "Learning Downstream Task by Selectively Capturing Complementary Knowledge from Multiple Self-supervisedly Learning Pretexts"

2 papers shown

Which Model to Transfer? Finding the Needle in the Growing Haystack
Cédric Renggli, André Susano Pinto, Luka Rimanic, J. Puigcerver, C. Riquelme, Ce Zhang, Mario Lucic
13 Oct 2020

Boosting Self-Supervised Learning via Knowledge Transfer
M. Noroozi, Ananth Vinjimoor, Paolo Favaro, Hamed Pirsiavash
SSL
01 May 2018