CLIP-MoE: Towards Building Mixture of Experts for CLIP with Diversified Multiplet Upcycling
arXiv: 2409.19291 · 28 September 2024
Jihai Zhang, Xiaoye Qu, Tong Zhu, Yu Cheng

Papers citing "CLIP-MoE: Towards Building Mixture of Experts for CLIP with Diversified Multiplet Upcycling" (4 / 4 papers shown)

From Head to Tail: Towards Balanced Representation in Large Vision-Language Models through Adaptive Data Calibration
Mingyang Song, Xiaoye Qu, Jiawei Zhou, Yu Cheng
VLM · 17 Mar 2025

A Comprehensive Survey of Mixture-of-Experts: Algorithms, Theory, and Applications
Siyuan Mu, Sen Lin
MoE · 10 Mar 2025

Make LoRA Great Again: Boosting LoRA with Adaptive Singular Values and Mixture-of-Experts Optimization Alignment
Chenghao Fan, Zhenyi Lu, Sichen Liu, Xiaoye Qu, Wei Wei, Chengfeng Gu, Yu Cheng
MoE · 24 Feb 2025

BiomedCLIP: a multimodal biomedical foundation model pretrained from fifteen million scientific image-text pairs
Sheng Zhang, Yanbo Xu, Naoto Usuyama, Hanwen Xu, J. Bagga, ..., Carlo Bifulco, M. Lungren, Tristan Naumann, Sheng Wang, Hoifung Poon
LM&MA, MedIm · 10 Jan 2025