The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality

18 May 2021
Vincent Froese, Christoph Hertrich, R. Niedermeier

Papers citing "The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality"

3 / 3 papers shown

When Deep Learning Meets Polyhedral Theory: A Survey
Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay
29 Apr 2023

Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes
Christian Haase, Christoph Hertrich, Georg Loho
24 Feb 2023

Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete
Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber
04 Apr 2022