Width is Less Important than Depth in ReLU Neural Networks

8 February 2022
Gal Vardi, Gilad Yehudai, Ohad Shamir

Papers citing "Width is Less Important than Depth in ReLU Neural Networks"

5 papers shown

  • Leaner Transformers: More Heads, Less Depth
    Hemanth Saratchandran, Damien Teney, Simon Lucey (27 May 2025)
  • The Role of Depth, Width, and Tree Size in Expressiveness of Deep Forest
    Shen-Huan Lyu, Jin-Hui Wu, Qin-Cheng Zheng, Baoliu Ye (6 Jul 2024)
  • Data Topology-Dependent Upper Bounds of Neural Network Widths
    Sangmin Lee, Jong Chul Ye (25 May 2023)
  • Multi-Path Transformer is Better: A Case Study on Neural Machine Translation
    Ye Lin, Shuhan Zhou, Yanyang Li, Anxiang Ma, Tong Xiao, Jingbo Zhu (10 May 2023)
  • Exponential Separations in Symmetric Neural Networks
    Aaron Zweig, Joan Bruna (2 Jun 2022)