ResearchTrend.AI

arXiv:1611.00740
Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review

2 November 2016
T. Poggio, H. Mhaskar, Lorenzo Rosasco, Brando Miranda, Q. Liao
Abstract

The paper characterizes classes of functions for which deep learning can be exponentially better than shallow learning. Deep convolutional networks are a special case satisfying these conditions, though weight sharing is not the main reason for their exponential advantage.
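The function classes the paper characterizes are hierarchically compositional: functions of many variables built as a tree of constituent functions, each acting on only a few inputs. A minimal sketch of such a function is shown below; the constituent function `h` and the tree depth are illustrative assumptions, not taken from the paper.

```python
# Sketch: a hierarchically compositional function of 8 variables,
# organized as a binary tree of 2-ary constituent functions.
# A deep network whose architecture mirrors this tree can approximate
# such functions without the curse of dimensionality; a shallow network
# treating all 8 inputs at once generally cannot.

def h(a, b):
    # An illustrative smooth 2-ary constituent function (assumption,
    # not a specific function from the paper).
    return (a * b + a + b) / 3.0

def compositional_f(x):
    """Evaluate a binary-tree composition over a list of 8 numbers."""
    level = list(x)
    while len(level) > 1:
        # Combine adjacent pairs at each level of the tree.
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

print(compositional_f([1.0] * 8))  # each pairwise h(1, 1) = 1, so the result is 1.0
```

The point of the sketch is structural: each constituent depends on only 2 variables, so approximating `compositional_f` layer by layer requires resources that scale with the local dimensionality (2), not the ambient dimensionality (8).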
