arXiv: 2110.10545
Ranking and Tuning Pre-trained Models: A New Paradigm for Exploiting Model Hubs
20 October 2021
Kaichao You, Yong Liu, Ziyang Zhang, Jianmin Wang, Michael I. Jordan, Mingsheng Long
Papers citing "Ranking and Tuning Pre-trained Models: A New Paradigm for Exploiting Model Hubs" (7 of 7 papers shown)
Capability Instruction Tuning: A New Paradigm for Dynamic LLM Routing (24 Feb 2025)
Yi-Kai Zhang, De-Chuan Zhan, Han-Jia Ye
ALM, ELM, LRM
Quick-Tune: Quickly Learning Which Pretrained Model to Finetune and How (06 Jun 2023)
Sebastian Pineda Arango, Fabio Ferreira, Arlind Kadra, Frank Hutter, Josif Grabocka
MLP-Mixer: An all-MLP Architecture for Vision (04 May 2021)
Ilya O. Tolstikhin, N. Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, ..., Andreas Steiner, Daniel Keysers, Jakob Uszkoreit, Mario Lucic, Alexey Dosovitskiy
Pre-trained Models for Natural Language Processing: A Survey (18 Mar 2020)
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
LM&MA, VLM
Improved Baselines with Momentum Contrastive Learning (09 Mar 2020)
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
SSL
Transferability and Hardness of Supervised Classification Tasks (21 Aug 2019)
Anh Tran, Cuong V. Nguyen, Tal Hassner
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (20 Apr 2018)
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM