Prototype-based HyperAdapter for Sample-Efficient Multi-task Tuning
arXiv:2310.11670 · 18 October 2023
Hao Zhao, Jie Fu, Zhaofeng He
Papers citing "Prototype-based HyperAdapter for Sample-Efficient Multi-task Tuning" (4 of 4 papers shown)
ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts
Akari Asai, Mohammadreza Salehi, Matthew E. Peters, Hannaneh Hajishirzi
24 May 2022

SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
Tu Vu, Brian Lester, Noah Constant, Rami Al-Rfou, Daniel Cer
15 Oct 2021

The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
18 Apr 2021

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018