HyperMixer: An MLP-based Low Cost Alternative to Transformers
arXiv:2203.03691 · 7 March 2022
Florian Mai, Arnaud Pannatier, Fabio Fehr, Haolin Chen, François Marelli, F. Fleuret, James Henderson
Papers citing "HyperMixer: An MLP-based Low Cost Alternative to Transformers" (7 of 7 papers shown)
Adaptivity and Modularity for Efficient Generalization Over Task Complexity (13 Oct 2023)
Samira Abnar, Omid Saremi, Laurent Dinh, Shantel Wilson, Miguel Angel Bautista, ..., Vimal Thilak, Etai Littwin, Jiatao Gu, Josh Susskind, Samy Bengio
SummaryMixing: A Linear-Complexity Alternative to Self-Attention for Speech Recognition and Understanding (12 Jul 2023)
Titouan Parcollet, Rogier van Dalen, Shucong Zhang, S. Bhattacharya
Expected Validation Performance and Estimation of a Random Variable's Maximum (01 Oct 2021)
Jesse Dodge, Suchin Gururangan, Dallas Card, Roy Schwartz, Noah A. Smith
MLP-Mixer: An all-MLP Architecture for Vision (04 May 2021)
Ilya O. Tolstikhin, N. Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, ..., Andreas Steiner, Daniel Keysers, Jakob Uszkoreit, Mario Lucic, Alexey Dosovitskiy
LambdaNetworks: Modeling Long-Range Interactions Without Attention (17 Feb 2021)
Irwan Bello
A Survey on Recent Approaches for Natural Language Processing in Low-Resource Scenarios (23 Oct 2020)
Michael A. Hedderich, Lukas Lange, Heike Adel, Jannik Strötgen, Dietrich Klakow
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (20 Apr 2018)
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman