Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete

4 April 2022
Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber
arXiv:2204.01368

Papers citing "Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete"

16 papers shown
1. Logical perspectives on learning statistical objects
   Aaron Anderson, Michael Benedikt (01 Apr 2025)
2. On the Expressiveness of Rational ReLU Neural Networks With Bounded Depth
   Gennadiy Averkov, Christopher Hojny, Maximilian Merkert (10 Feb 2025)
3. On the Complexity of Identification in Linear Structural Causal Models
   Julian Dörfler, Benito van der Zander, Markus Bläser, Maciej Liskiewicz (17 Jul 2024)
4. Graph Neural Networks and Arithmetic Circuits
   Timon Barlag, Vivian Holzapfel, Laura Strieker, Jonni Virtema, H. Vollmer (27 Feb 2024)
5. Polynomial-Time Solutions for ReLU Network Training: A Complexity Classification via Max-Cut and Zonotopes
   Yifei Wang, Mert Pilanci (18 Nov 2023)
6. Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions
   Teemu Hankala, Miika Hannula, J. Kontinen, Jonni Virtema (19 May 2023)
7. When Deep Learning Meets Polyhedral Theory: A Survey
   Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay (29 Apr 2023)
8. Training Neural Networks is NP-Hard in Fixed Dimension
   Vincent Froese, Christoph Hertrich (29 Mar 2023)
9. Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes
   Christian Haase, Christoph Hertrich, Georg Loho (24 Feb 2023)
10. Neural networks with linear threshold activations: structure and algorithms
    Sammy Khalife, Hongyu Cheng, A. Basu (15 Nov 2021)
11. On Classifying Continuous Constraint Satisfaction Problems
    Tillmann Miltzow, R. F. Schmiermann (04 Jun 2021)
12. Towards Lower Bounds on the Depth of ReLU Neural Networks
    Christoph Hertrich, A. Basu, M. D. Summa, M. Skutella (31 May 2021)
13. The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality
    Vincent Froese, Christoph Hertrich, R. Niedermeier (18 May 2021)
14. ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation
    Christoph Hertrich, Leon Sering (12 Feb 2021)
15. Provably Good Solutions to the Knapsack Problem via Neural Networks of Bounded Size
    Christoph Hertrich, M. Skutella (28 May 2020)
16. Benefits of depth in neural networks
    Matus Telgarsky (14 Feb 2016)