ResearchTrend.AI
arXiv:2006.14798 (v3, latest)

Training Convolutional ReLU Neural Networks in Polynomial Time: Exact Convex Optimization Formulations

26 June 2020
Tolga Ergen
Mert Pilanci
Abstract

We study the training of Convolutional Neural Networks (CNNs) with ReLU activations and introduce exact convex optimization formulations with polynomial complexity in the number of data samples, the number of neurons, and the data dimension. In particular, we develop a convex analytic framework based on semi-infinite duality to obtain equivalent convex optimization problems for several CNN architectures. We first prove that two-layer CNNs can be globally optimized via an ℓ2-norm regularized convex program. We then show that certain three-layer CNN training problems are equivalent to an ℓ1 regularized convex program. We also extend these results to multi-layer CNN architectures. Furthermore, we present extensions of our approach to different pooling methods.
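The convex reformulations of this kind rest on enumerating the finitely many ReLU activation patterns ("hyperplane arrangements") a hidden neuron can induce on a fixed data matrix; the convex program then optimizes over these fixed patterns instead of the non-convex hidden weights. The following is a minimal sketch of that enumeration step, assuming only NumPy; the data sizes, sampling scheme, and variable names are illustrative and not taken from the authors' code.

```python
import numpy as np

# Hedged sketch: enumerate the ReLU activation patterns
# D = diag(1[X u >= 0]) induced on a small data matrix X by random
# hidden-weight directions u. For data of fixed rank, the number of
# distinct patterns grows only polynomially in the number of samples,
# which is what makes an exact convex reformulation tractable.

rng = np.random.default_rng(0)
n, d = 6, 2                      # n samples, d features (illustrative sizes)
X = rng.standard_normal((n, d))

patterns = set()
for _ in range(20000):           # random probing finds the patterns w.h.p.
    u = rng.standard_normal(d)
    patterns.add(tuple((X @ u >= 0).astype(int)))

# For generic data in R^2, at most 2n sign patterns occur,
# far fewer than the naive 2**n = 64.
print(len(patterns))
```

In the full convex program, one linear-combination variable is attached to each enumerated pattern, with linear constraints ensuring the pattern is realized, which yields a convex problem of size polynomial in the number of samples for fixed data rank.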
