Co-advise: Cross Inductive Bias Distillation (arXiv 2106.12378)
23 June 2021
Sucheng Ren, Zhengqi Gao, Tianyu Hua, Zihui Xue, Yonglong Tian, Shengfeng He, Hang Zhao
Papers citing "Co-advise: Cross Inductive Bias Distillation" (7 of 7 shown)
Rethinking Multi-view Representation Learning via Distilled Disentangling
Guanzhou Ke, Bo Wang, Xiaoli Wang, Shengfeng He · 16 Mar 2024

Unsupervised Multi-modal Feature Alignment for Time Series Representation Learning [AI4TS]
Cheng Liang, Donghua Yang, Zhiyu Liang, Hongzhi Wang, Zheng Liang, Xiyang Zhang, Jianfeng Huang · 09 Dec 2023

Efficiency 360: Efficient Vision Transformers
Badri N. Patro, Vijay Srinivas Agneeswaran · 16 Feb 2023

TinyMIM: An Empirical Study of Distilling MIM Pre-trained Models
Sucheng Ren, Fangyun Wei, Zheng-Wei Zhang, Han Hu · 03 Jan 2023

Shunted Self-Attention via Multi-Scale Token Aggregation [ViT]
Sucheng Ren, Daquan Zhou, Shengfeng He, Jiashi Feng, Xinchao Wang · 30 Nov 2021

CMT: Convolutional Neural Networks Meet Vision Transformers [ViT]
Jianyuan Guo, Kai Han, Han Wu, Yehui Tang, Chunjing Xu, Yunhe Wang, Chang Xu · 13 Jul 2021

MLP-Mixer: An all-MLP Architecture for Vision
Ilya O. Tolstikhin, N. Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, ..., Andreas Steiner, Daniel Keysers, Jakob Uszkoreit, Mario Lucic, Alexey Dosovitskiy · 04 May 2021