ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation

12 February 2021
Christoph Hertrich, Leon Sering
arXiv: 2102.06635

Papers citing "ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation"

3 / 3 papers shown
Looped ReLU MLPs May Be All You Need as Practical Programmable Computers
Yingyu Liang, Zhizhou Sha, Zhenmei Shi, Zhao-quan Song, Yufa Zhou
21 Feb 2025
Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete
Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber
04 Apr 2022
Benefits of depth in neural networks
Matus Telgarsky
14 Feb 2016