
arXiv:1902.11294
A lattice-based approach to the expressivity of deep ReLU neural networks

28 February 2019
V. Corlay
J. Boutros
P. Ciblat
L. Brunel
Abstract

We present new families of continuous piecewise linear (CPWL) functions in R^n having a number of affine pieces growing exponentially in n. We show that these functions can be seen as the high-dimensional generalization of the triangle wave function used by Telgarsky in 2016. We prove that they can be computed by ReLU networks with quadratic depth and linear width in the space dimension. We also investigate the approximation error of one of these functions by shallower networks and prove a separation result. The main difference between our functions and other constructions is their practical interest: they arise in the scope of channel coding. Hence, computing such functions amounts to performing a decoding operation.
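The one-dimensional building block the abstract refers to can be sketched concretely. The following is a minimal illustration (not the paper's construction) of Telgarsky's triangle wave expressed with ReLU units: each composition of the two-unit map doubles the number of affine pieces, so depth k yields 2^k pieces on [0, 1].

```python
def relu(x: float) -> float:
    """Rectified linear unit."""
    return max(0.0, x)

def triangle(x: float) -> float:
    # t(x) = 2x on [0, 0.5] and 2(1 - x) on [0.5, 1],
    # realized by a single two-unit ReLU layer:
    # 2*relu(x) - 4*relu(x - 0.5).
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def triangle_k(x: float, k: int) -> float:
    # Depth-k composition: each extra layer doubles
    # the number of affine pieces (2^k pieces total).
    for _ in range(k):
        x = triangle(x)
    return x
```

For example, `triangle_k(1/8, 3)` evaluates to 1.0: three layers map 1/8 → 1/4 → 1/2 → 1, illustrating how a network of modest width gains expressivity exponentially with depth. The paper's contribution is a high-dimensional analogue of this behavior arising from lattice decoding.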
