Width is Less Important than Depth in ReLU Neural Networks
Gal Vardi, Gilad Yehudai, Ohad Shamir
arXiv:2202.03841 (v2, latest), 8 February 2022
Papers citing "Width is Less Important than Depth in ReLU Neural Networks" (5 of 5 shown):

Leaner Transformers: More Heads, Less Depth
Hemanth Saratchandran, Damien Teney, Simon Lucey (27 May 2025)

The Role of Depth, Width, and Tree Size in Expressiveness of Deep Forest
Shen-Huan Lyu, Jin-Hui Wu, Qin-Cheng Zheng, Baoliu Ye (6 Jul 2024)

Data Topology-Dependent Upper Bounds of Neural Network Widths
Sangmin Lee, Jong Chul Ye (25 May 2023)

Multi-Path Transformer is Better: A Case Study on Neural Machine Translation
Ye Lin, Shuhan Zhou, Yanyang Li, Anxiang Ma, Tong Xiao, Jingbo Zhu (10 May 2023)

Exponential Separations in Symmetric Neural Networks
Aaron Zweig, Joan Bruna (2 Jun 2022)