Pay Attention to MLPs
Neural Information Processing Systems (NeurIPS), 2021
17 May 2021
Hanxiao Liu
Zihang Dai
David R. So
Quoc V. Le
arXiv: 2105.08050
Papers citing "Pay Attention to MLPs" (23 of 323 shown)
CycleMLP: A MLP-like Architecture for Dense Prediction
International Conference on Learning Representations (ICLR), 2021
Shoufa Chen
Enze Xie
Chongjian Ge
Runjian Chen
Ding Liang
Ping Luo
21 Jul 2021
AS-MLP: An Axial Shifted MLP Architecture for Vision
International Conference on Learning Representations (ICLR), 2021
Dongze Lian
Zehao Yu
Xing Sun
Shenghua Gao
18 Jul 2021
Visual Transformer with Statistical Test for COVID-19 Classification
Chih-Chung Hsu
Guan-Lin Chen
Mei-Hsuan Wu
ViT
MedIm
12 Jul 2021
What Makes for Hierarchical Vision Transformer?
Yuxin Fang
Xinggang Wang
Rui Wu
Wenyu Liu
ViT
05 Jul 2021
Global Filter Networks for Image Classification
Yongming Rao
Wenliang Zhao
Zheng Zhu
Jiwen Lu
Jie Zhou
ViT
01 Jul 2021
Multi-Exit Vision Transformer for Dynamic Inference
British Machine Vision Conference (BMVC), 2021
Arian Bakhtiarnia
Tao Gui
Alexandros Iosifidis
29 Jun 2021
Rethinking Token-Mixing MLP for MLP-based Vision Backbone
British Machine Vision Conference (BMVC), 2021
Tan Yu
Xu Li
Yunfeng Cai
Mingming Sun
Ping Li
28 Jun 2021
Vision Permutator: A Permutable MLP-Like Architecture for Visual Recognition
Qibin Hou
Zihang Jiang
Li Yuan
Ming-Ming Cheng
Shuicheng Yan
Jiashi Feng
ViT
MLLM
23 Jun 2021
Towards Biologically Plausible Convolutional Networks
Neural Information Processing Systems (NeurIPS), 2021
Roman Pogodin
Yash Mehta
Timothy Lillicrap
P. Latham
22 Jun 2021
MLP Singer: Towards Rapid Parallel Korean Singing Voice Synthesis
International Workshop on Machine Learning for Signal Processing (MLSP), 2021
Jaesung Tae
Hyeongju Kim
Younggun Lee
15 Jun 2021
S²-MLP: Spatial-Shift MLP Architecture for Vision
IEEE Workshop/Winter Conference on Applications of Computer Vision (WACV), 2021
Tan Yu
Xu Li
Yunfeng Cai
Mingming Sun
Ping Li
14 Jun 2021
On the Connection between Local Attention and Dynamic Depth-wise Convolution
International Conference on Learning Representations (ICLR), 2021
Qi Han
Zejia Fan
Jingdong Sun
Lei-huan Sun
Ming-Ming Cheng
Jiaying Liu
Jingdong Wang
ViT
08 Jun 2021
A Lightweight and Gradient-Stable Neural Layer
Neural Networks (NN), 2021
Yueyao Yu
Yin Zhang
08 Jun 2021
Vision Transformers with Hierarchical Attention
Machine Intelligence Research (MIR), 2021
Yun-Hai Liu
Yu-Huan Wu
Guolei Sun
Le Zhang
Ajad Chhatkuli
Luc Van Gool
ViT
06 Jun 2021
When Vision Transformers Outperform ResNets without Pre-training or Strong Data Augmentations
International Conference on Learning Representations (ICLR), 2021
Xiangning Chen
Cho-Jui Hsieh
Boqing Gong
ViT
03 Jun 2021
An Attention Free Transformer
Shuangfei Zhai
Walter A. Talbott
Nitish Srivastava
Chen Huang
Hanlin Goh
Ruixiang Zhang
J. Susskind
ViT
28 May 2021
ResMLP: Feedforward networks for image classification with data-efficient training
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2021
Hugo Touvron
Piotr Bojanowski
Mathilde Caron
Matthieu Cord
Alaaeldin El-Nouby
...
Gautier Izacard
Armand Joulin
Gabriel Synnaeve
Jakob Verbeek
Edouard Grave
VLM
07 May 2021
RepMLP: Re-parameterizing Convolutions into Fully-connected Layers for Image Recognition
Xiaohan Ding
Chunlong Xia
Xinming Zhang
Xiaojie Chu
Jungong Han
Guiguang Ding
05 May 2021
A Practical Survey on Faster and Lighter Transformers
ACM Computing Surveys (CSUR), 2021
Quentin Fournier
G. Caron
Daniel Aloise
26 Mar 2021
Can Vision Transformers Learn without Natural Images?
AAAI Conference on Artificial Intelligence (AAAI), 2021
Kodai Nakashima
Hirokatsu Kataoka
Asato Matsumoto
K. Iwata
Nakamasa Inoue
ViT
24 Mar 2021
Red Alarm for Pre-trained Models: Universal Vulnerability to Neuron-Level Backdoor Attacks
Machine Intelligence Research (MIR), 2021
Zhengyan Zhang
Guangxuan Xiao
Yongwei Li
Tian Lv
Fanchao Qi
Zhiyuan Liu
Yasheng Wang
Xin Jiang
Maosong Sun
AAML
18 Jan 2021
Not all parameters are born equal: Attention is mostly what you need
Nikolay Bogoychev
MoE
22 Oct 2020
Efficient Transformers: A Survey
ACM Computing Surveys (CSUR), 2020
Yi Tay
Mostafa Dehghani
Dara Bahri
Donald Metzler
VLM
14 Sep 2020