Multimodal Knowledge Expansion
arXiv: 2103.14431 · 26 March 2021
Authors: Zihui Xue, Sucheng Ren, Zhengqi Gao, Hang Zhao
Papers citing "Multimodal Knowledge Expansion" (7 papers):
Teacher-Student Architecture for Knowledge Distillation: A Survey — Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu (08 Aug 2023)
TinyMIM: An Empirical Study of Distilling MIM Pre-trained Models — Sucheng Ren, Fangyun Wei, Zheng-Wei Zhang, Han Hu (03 Jan 2023)
Training-Free Robust Multimodal Learning via Sample-Wise Jacobian Regularization — Zhengqi Gao, Sucheng Ren, Zihui Xue, Siting Li, Hang Zhao (05 Apr 2022)
Co-advise: Cross Inductive Bias Distillation — Sucheng Ren, Zhengqi Gao, Tianyu Hua, Zihui Xue, Yonglong Tian, Shengfeng He, Hang Zhao (23 Jun 2021)
Meta Pseudo Labels — Hieu H. Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le [VLM] (23 Mar 2020)
Confidence Regularized Self-Training — Yang Zou, Zhiding Yu, Xiaofeng Liu, B. Kumar, Jinsong Wang (26 Aug 2019)
Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results — Antti Tarvainen, Harri Valpola [OOD, MoMe] (06 Mar 2017)