ResearchTrend.AI
arXiv:1611.09448
The Upper Bound on Knots in Neural Networks

29 November 2016
Kevin K. Chen
Abstract

Neural networks with rectified linear unit activations are essentially multivariate linear splines. As such, one of many ways to measure the "complexity" or "expressivity" of a neural network is to count the number of knots in the spline model. We study the number of knots in fully-connected feedforward neural networks with rectified linear unit activation functions. We intentionally keep the neural networks very simple, so as to make theoretical analyses more approachable. An induction on the number of layers $l$ reveals a tight upper bound on the number of knots in $\mathbb{R} \to \mathbb{R}^p$ deep neural networks. With $n_i \gg 1$ neurons in layer $i = 1, \dots, l$, the upper bound is approximately $n_1 \cdots n_l$. We then show that the exact upper bound is tight, and we demonstrate the upper bound with an example. The purpose of these analyses is to pave a path for understanding the behavior of general $\mathbb{R}^q \to \mathbb{R}^p$ neural networks.
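The knot-counting idea in the abstract can be illustrated numerically. The sketch below (not from the paper; network sizes, seed, and the crude per-layer bound are illustrative assumptions) builds a tiny random $\mathbb{R} \to \mathbb{R}$ ReLU network with hidden widths $n_1 = 4$ and $n_2 = 3$, then locates knots as the points where the on/off pattern of the hidden ReLUs flips along a dense input grid:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny R -> R network with hidden widths n_1 = 4, n_2 = 3
# (sizes and random weights are illustrative, not taken from the paper).
widths = [1, 4, 3, 1]
weights = [rng.standard_normal((a, b)) for a, b in zip(widths[:-1], widths[1:])]
biases = [rng.standard_normal(b) for b in widths[1:]]

def activation_pattern(x):
    """Binary on/off pattern of every hidden ReLU at scalar inputs x."""
    h = np.asarray(x, dtype=float).reshape(-1, 1)
    pats = []
    for W, b in zip(weights[:-1], biases[:-1]):
        z = h @ W + b
        pats.append(z > 0)        # which neurons are active at each input
        h = np.maximum(z, 0.0)
    return np.hstack(pats)        # shape (len(x), n_1 + n_2)

# A knot of the spline can only occur where the activation pattern flips,
# so scan a dense grid and count pattern changes between neighbours.
x = np.linspace(-20.0, 20.0, 400001)
pats = activation_pattern(x)
knots = int(np.sum(np.any(pats[1:] != pats[:-1], axis=1)))

# Crude sanity bound (my own counting, not the paper's exact result):
# layer 1 contributes at most n_1 flips, and each second-layer neuron's
# piecewise-linear pre-activation crosses zero at most once per piece,
# i.e. at most n_1 + 1 times.
crude_bound = 4 + 3 * (4 + 1)     # = 19
approx_bound = 4 * 3              # n_1 * n_2, the paper's large-width estimate
print(f"knots observed: {knots}, n_1*n_2 = {approx_bound}, crude cap = {crude_bound}")
```

For a random network the observed count typically falls well below the product $n_1 n_2$; the paper's point is that no configuration of weights can push it past the (tight) upper bound.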
