arXiv: 2212.01032
Systematic Analysis for Pretrained Language Model Priming for Parameter-Efficient Fine-tuning
Shih-Cheng Huang, Shi Wang, Min-Han Shih, Saurav Sahay, Hung-yi Lee
2 December 2022
Papers citing "Systematic Analysis for Pretrained Language Model Priming for Parameter-Efficient Fine-tuning" (6 of 6 shown)
1. Know Where You're Going: Meta-Learning for Parameter-Efficient Fine-Tuning
   Mozhdeh Gheini, Xuezhe Ma, Jonathan May — 25 May 2022 (30 / 5 / 0)

2. SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
   Tu Vu, Brian Lester, Noah Constant, Rami Al-Rfou, Daniel Matthew Cer — 15 Oct 2021 (VLM, LRM; 131 / 276 / 0)

3. P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks
   Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang — 14 Oct 2021 (VLM; 228 / 780 / 0)

4. CrossFit: A Few-shot Learning Challenge for Cross-task Generalization in NLP
   Qinyuan Ye, Bill Yuchen Lin, Xiang Ren — 18 Apr 2021 (199 / 167 / 0)

5. The Power of Scale for Parameter-Efficient Prompt Tuning
   Brian Lester, Rami Al-Rfou, Noah Constant — 18 Apr 2021 (VPVLM; 275 / 3,784 / 0)

6. Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
   Chelsea Finn, Pieter Abbeel, Sergey Levine — 09 Mar 2017 (OOD; 234 / 11,568 / 0)