Neural networks with linear threshold activations: structure and algorithms

15 November 2021
Sammy Khalife
Hongyu Cheng
A. Basu
Papers citing "Neural networks with linear threshold activations: structure and algorithms"

4 / 4 papers shown

1. When Deep Learning Meets Polyhedral Theory: A Survey
   Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay
   29 Apr 2023

2. Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes
   Christian Haase, Christoph Hertrich, Georg Loho
   24 Feb 2023

3. Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete
   Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber
   04 Apr 2022

4. Benefits of depth in neural networks
   Matus Telgarsky
   14 Feb 2016