ResearchTrend.AI

Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes
arXiv:2302.12553
24 February 2023
Christian Haase, Christoph Hertrich, Georg Loho

Papers citing "Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes"

16 papers
On the Depth of Monotone ReLU Neural Networks and ICNNs
Egor Bakaev, Florestan Brunck, Christoph Hertrich, Daniel Reichman, Amir Yehudayoff
09 May 2025

On the Expressiveness of Rational ReLU Neural Networks With Bounded Depth
Gennadiy Averkov, Christopher Hojny, Maximilian Merkert
10 Feb 2025

Neural Networks and (Virtual) Extended Formulations
Christoph Hertrich, Georg Loho
05 Nov 2024

Computability of Classification and Deep Learning: From Theoretical Limits to Practical Feasibility through Quantization
Holger Boche, Vít Fojtík, Adalbert Fono, Gitta Kutyniok
12 Aug 2024

On Minimal Depth in Neural Networks
J. L. Valerdi
23 Feb 2024

Defining Neural Network Architecture through Polytope Structures of Dataset
Sangmin Lee, Abbas Mammadov, Jong Chul Ye
04 Feb 2024

Extracting Formulae in Many-Valued Logic from Deep Neural Networks
Yani Zhang, Helmut Bölcskei
22 Jan 2024

Topological Expressivity of ReLU Neural Networks
Ekin Ergen, Moritz Grillo
17 Oct 2023

How Many Neurons Does it Take to Approximate the Maximum?
Itay Safran, Daniel Reichman, Paul Valiant
18 Jul 2023

Neural Polytopes
Koji Hashimoto, T. Naito, Hisashi Naito
03 Jul 2023

Training Neural Networks is NP-Hard in Fixed Dimension
Vincent Froese, Christoph Hertrich
29 Mar 2023

Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete
Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber
04 Apr 2022

Towards Lower Bounds on the Depth of ReLU Neural Networks
Christoph Hertrich, A. Basu, M. D. Summa, M. Skutella
31 May 2021

ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation
Christoph Hertrich, Leon Sering
12 Feb 2021

Provably Good Solutions to the Knapsack Problem via Neural Networks of Bounded Size
Christoph Hertrich, M. Skutella
28 May 2020

Benefits of depth in neural networks
Matus Telgarsky
14 Feb 2016