ResearchTrend.AI

Composed Fine-Tuning: Freezing Pre-Trained Denoising Autoencoders for Improved Generalization

Sang Michael Xie, Tengyu Ma, Percy Liang
29 June 2020 · arXiv:2006.16205

Papers citing "Composed Fine-Tuning: Freezing Pre-Trained Denoising Autoencoders for Improved Generalization"

4 / 4 papers shown

  • Open Domain Generalization with a Single Network by Regularization Exploiting Pre-trained Features
    Inseop Chung, Kiyoon Yoo, Nojun Kwak · VLM · 08 Dec 2023
  • Backdoor Learning for NLP: Recent Advances, Challenges, and Future Research Directions
    Marwan Omar · SILM, AAML · 14 Feb 2023
  • The Power of Scale for Parameter-Efficient Prompt Tuning
    Brian Lester, Rami Al-Rfou, Noah Constant · VPVLM · 18 Apr 2021
  • Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models
    Cheolhyoung Lee, Kyunghyun Cho, Wanmo Kang · MoE · 25 Sep 2019