On the Expressiveness of Rational ReLU Neural Networks With Bounded Depth

Abstract

To confirm that the expressive power of ReLU neural networks grows with their depth, the function $F_n = \max\{0, x_1, \ldots, x_n\}$ has been considered in the literature. A conjecture by Hertrich, Basu, Di Summa, and Skutella [NeurIPS 2021] states that any ReLU network that exactly represents $F_n$ has at least $\lceil \log_2(n+1) \rceil$ hidden layers. The conjecture has recently been confirmed for networks with integer weights by Haase, Hertrich, and Loho [ICLR 2023]. We follow up on this line of research and show that, within ReLU networks whose weights are decimal fractions, $F_n$ can only be represented by networks with at least $\lceil \log_3(n+1) \rceil$ hidden layers. Moreover, if all weights are $N$-ary fractions, then $F_n$ can only be represented by networks with at least $\Omega\left(\frac{\ln n}{\ln \ln N}\right)$ layers. These results partially confirm the above conjecture for rational ReLU networks and provide the first non-constant lower bound on the depth of practically relevant ReLU networks.

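For context (this is the folklore upper bound the conjecture is measured against, not the paper's contribution): $F_n$ can be computed with $\lceil \log_2(n+1) \rceil$ hidden layers by taking pairwise maxima in a balanced tree, using the identity $\max(a,b) = \mathrm{ReLU}(a-b) + \mathrm{ReLU}(b) - \mathrm{ReLU}(-b)$, and this construction uses only integer weights. The sketch below is a minimal numerical illustration of that construction; the function names and structure are illustrative and not taken from the paper.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def pair_max(a, b):
    # One "hidden layer" of three ReLU units plus an affine combination:
    #   max(a, b) = ReLU(a - b) + ReLU(b) - ReLU(-b)
    return relu(a - b) + relu(b) - relu(-b)

def relu_max_with_zero(x):
    """Evaluate F_n = max{0, x_1, ..., x_n} by a balanced tree of pairwise
    maxima. The number of rounds matches the classical ceil(log2(n+1))
    hidden-layer upper bound."""
    vals = list(x) + [0.0]            # include the constant 0
    depth = 0
    while len(vals) > 1:
        nxt = [pair_max(vals[i], vals[i + 1])
               for i in range(0, len(vals) - 1, 2)]
        if len(vals) % 2 == 1:
            # An unpaired value is carried to the next round; in an actual
            # network it would be forwarded as ReLU(v) - ReLU(-v), so the
            # depth count is unaffected.
            nxt.append(vals[-1])
        vals = nxt
        depth += 1
    return vals[0], depth

if __name__ == "__main__":
    x = np.array([-1.5, 3.0, 0.25, -4.0, 2.0])
    value, depth = relu_max_with_zero(x)
    print(value, depth)   # 3.0 and depth 3 = ceil(log2(5 + 1))
```

The paper's question is the converse: how much of this depth is unavoidable once the weights are restricted to decimal (or, more generally, $N$-ary) fractions.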
@article{averkov2025_2502.06283,
  title={On the Expressiveness of Rational ReLU Neural Networks With Bounded Depth},
  author={Gennadiy Averkov and Christopher Hojny and Maximilian Merkert},
  journal={arXiv preprint arXiv:2502.06283},
  year={2025}
}