ResearchTrend.AI
ScaLearn: Simple and Highly Parameter-Efficient Task Transfer by Learning to Scale
arXiv:2310.01217 · 2 October 2023
Markus Frohmann, Carolin Holtermann, Shahed Masoudian, Anne Lauscher, Navid Rekabsaz

Papers citing "ScaLearn: Simple and Highly Parameter-Efficient Task Transfer by Learning to Scale"

3 / 3 papers shown
1. Do Current Multi-Task Optimization Methods in Deep Learning Even Help?
   Derrick Xin, Behrooz Ghorbani, Ankush Garg, Orhan Firat, Justin Gilmer
   MoMe · 23 Sep 2022

2. ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts
   Akari Asai, Mohammadreza Salehi, Matthew E. Peters, Hannaneh Hajishirzi
   24 May 2022

3. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   ELM · 20 Apr 2018