arXiv 2110.06482 (v3, latest)
Parallel Deep Neural Networks Have Zero Duality Gap
Yifei Wang, Tolga Ergen, Mert Pilanci
13 October 2021
Links: ArXiv (abs) · PDF · HTML
Papers citing "Parallel Deep Neural Networks Have Zero Duality Gap" (6 / 6 papers shown):
1. Convex Relaxations of ReLU Neural Networks Approximate Global Optima in Polynomial Time. Sungyoon Kim, Mert Pilanci. 06 Feb 2024. (317 / 6 / 0)
2. ReLU Neural Networks with Linear Layers are Biased Towards Single- and Multi-Index Models. Suzanna Parkinson, Greg Ongie, Rebecca Willett. 24 May 2023. (266 / 6 / 0)
3. Fast Convex Optimization for Two-Layer ReLU Networks: Equivalent Model Classes and Cone Decompositions. Aaron Mishkin, Arda Sahiner, Mert Pilanci. 02 Feb 2022. (312 / 32 / 0)
4. Path Regularization: A Convexity and Sparsity Inducing Regularization for Parallel ReLU Networks. Tolga Ergen, Mert Pilanci. 18 Oct 2021. (249 / 17 / 0)
5. Demystifying Batch Normalization in ReLU Networks: Equivalent Convex Optimization Models and Implicit Regularization. Tolga Ergen, Arda Sahiner, Batu Mehmet Ozturkler, John M. Pauly, Morteza Mardani, Mert Pilanci. 02 Mar 2021. (216 / 32 / 0)
6. Xception: Deep Learning with Depthwise Separable Convolutions. François Chollet. 07 Oct 2016. (2.2K / 15,350 / 0)