Zero-Shot Distillation for Image Encoders: How to Make Effective Use of Synthetic Data
Niclas Popp, J. H. Metzen, Matthias Hein [VLM]
arXiv 2404.16637, 25 April 2024

Papers citing "Zero-Shot Distillation for Image Encoders: How to Make Effective Use of Synthetic Data" (6 of 6 papers shown)

SynthCLIP: Are We Ready for a Fully Synthetic CLIP Training?
Hasan Hammoud, Hani Itani, Fabio Pizzati, Philip H. S. Torr, Adel Bibi, Bernard Ghanem [CLIP, VLM]
02 Feb 2024

Diversify, Don't Fine-Tune: Scaling Up Visual Recognition Training with Synthetic Images
Zhuoran Yu, Chenchen Zhu, Sean Culatana, Raghuraman Krishnamoorthi, Fanyi Xiao, Yong Jae Lee
04 Dec 2023

LCM-LoRA: A Universal Stable-Diffusion Acceleration Module
Simian Luo, Yiqin Tan, Suraj Patil, Daniel Gu, Patrick von Platen, Apolinário Passos, Longbo Huang, Jian Li, Hang Zhao [MoMe]
09 Nov 2023

Distilling Out-of-Distribution Robustness from Vision-Language Foundation Models
Andy Zhou, Jindong Wang, Yu-xiong Wang, Haohan Wang [VLM]
02 Nov 2023

Identification of Systematic Errors of Image Classifiers on Rare Subgroups
J. H. Metzen, Robin Hutmacher, N. G. Hua, Valentyn Boreiko, Dan Zhang [AAML, VLM]
09 Mar 2023

BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models
Junnan Li, Dongxu Li, Silvio Savarese, Steven C. H. Hoi [VLM, MLLM]
30 Jan 2023