arXiv:2208.08191
Transformer Vs. MLP-Mixer: Exponential Expressive Gap For NLP Problems
17 August 2022
D. Navon, A. Bronstein
Papers citing
"Transformer Vs. MLP-Mixer: Exponential Expressive Gap For NLP Problems"
MLP-Mixer: An all-MLP Architecture for Vision
Ilya O. Tolstikhin, N. Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, ..., Andreas Steiner, Daniel Keysers, Jakob Uszkoreit, Mario Lucic, Alexey Dosovitskiy
04 May 2021
Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
16 Nov 2016
Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
25 Aug 2016