arXiv:2205.11961
ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts
24 May 2022
Akari Asai, Mohammadreza Salehi, Matthew E. Peters, Hannaneh Hajishirzi
Papers citing "ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts" (7 papers)
Efficient Knowledge Transfer in Multi-Task Learning through Task-Adaptive Low-Rank Representation
Xiao Zhang, Kangsheng Wang, Tianyu Hu, Huimin Ma · 20 Apr 2025
Learning Optimal Prompt Ensemble for Multi-source Visual Prompt Transfer
Enming Zhang, Liwen Cao, Yanru Wu, Zijie Zhao, Guan Wang, Yang Li · 09 Apr 2025
Multitask Prompted Training Enables Zero-Shot Task Generalization
Victor Sanh, Albert Webson, Colin Raffel, Stephen H. Bach, Lintang Sutawika, ..., Stella Biderman, Leo Gao, Thomas Wolf, Alexander M. Rush · 15 Oct 2021
SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
Tu Vu, Brian Lester, Noah Constant, Rami Al-Rfou, Daniel Cer · 15 Oct 2021
Single-dataset Experts for Multi-dataset Question Answering
Dan Friedman, Ben Dodge, Danqi Chen · 28 Sep 2021
The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant · 18 Apr 2021
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman · 20 Apr 2018