Model Reprogramming Outperforms Fine-tuning on Out-of-distribution Data in Text-Image Encoders

16 March 2024
Andrew Geng, Pin-Yu Chen

Papers citing "Model Reprogramming Outperforms Fine-tuning on Out-of-distribution Data in Text-Image Encoders"

2 papers shown

Neural Network Reprogrammability: A Unified Theme on Model Reprogramming, Prompt Tuning, and Prompt Instruction
Zesheng Ye, C. Cai, Ruijiang Dong, Jianzhong Qi, Bingquan Shen, Pin-Yu Chen, Feng Liu
05 Jun 2025
CLIP-Adapter: Better Vision-Language Models with Feature Adapters
International Journal of Computer Vision (IJCV), 2021
Shiyang Feng, Shijie Geng, Renrui Zhang, Teli Ma, Rongyao Fang, Zelong Li, Jiaming Song, Yu Qiao
09 Oct 2021