Depth-Width Trade-offs for Neural Networks via Topological Entropy

15 October 2020
Kaifeng Bu
Yaobo Zhang
Qingxian Luo
arXiv:2010.07587
Abstract

One of the central problems in deep learning theory is to understand how structural properties, such as depth, width, and the number of nodes, affect the expressivity of deep neural networks. In this work, we show a new connection between the expressivity of deep neural networks and topological entropy from dynamical systems, which can be used to characterize depth-width trade-offs of neural networks. We provide an upper bound on the topological entropy of neural networks with continuous semi-algebraic units in terms of their structural parameters. Specifically, the topological entropy of a ReLU network with l layers and m nodes per layer is upper bounded by O(l log m). Moreover, if the neural network is a good approximation of some function f, then the size of the neural network has a lower bound that is exponential in the topological entropy of f. Finally, we discuss the relationship between topological entropy, the number of oscillations, periods, and the Lipschitz constant.
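As a rough illustration of the dynamical-systems connection, here is a minimal sketch (not code from the paper): the tent map on [0, 1] can be written exactly as a width-2 ReLU layer, and its n-fold composition, a depth-n ReLU network, oscillates about 2^n times, consistent with its topological entropy log 2. Under the paper's O(l log m) bound, any ReLU network with l layers and m nodes per layer reproducing these oscillations must satisfy l log m = Ω(n). The grid size and the crossing level 1/3 below are arbitrary choices for the sketch.

```python
import numpy as np

def tent(x):
    """Tent map on [0, 1], written as a width-2 ReLU layer:
    tent(x) = 2*relu(x) - 4*relu(x - 1/2)."""
    relu = lambda z: np.maximum(z, 0.0)
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

# Count how often the n-fold composition tent^n crosses the level 1/3.
# For the tent map this doubles each iteration (2^n crossings), matching
# its topological entropy log 2.
xs = np.linspace(0.0, 1.0, 400_001)
ys = xs.copy()
for n in range(1, 11):
    ys = tent(ys)  # ys now samples tent^n on the grid
    # A crossing occurs where adjacent samples straddle the level.
    crossings = np.count_nonzero((ys[:-1] - 1/3) * (ys[1:] - 1/3) < 0)
    print(f"n={n:2d}  crossings={crossings:5d}  "
          f"log(crossings)/n={np.log(crossings)/n:.3f}")
```

The printed estimate log(crossings)/n approaches log 2 ≈ 0.693, the topological entropy of the tent map, while the composed network's depth grows only linearly in n.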
