Depth-Width Trade-offs for Neural Networks via Topological Entropy
Kaifeng Bu, Yaobo Zhang, Qingxian Luo
arXiv:2010.07587, 15 October 2020

Papers citing "Depth-Width Trade-offs for Neural Networks via Topological Entropy"

6 papers shown

Data Topology-Dependent Upper Bounds of Neural Network Widths
Sangmin Lee, Jong Chul Ye (25 May 2023)

Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis
Wuyang Chen, Wei Huang, Xinyu Gong, Boris Hanin, Zhangyang Wang (11 May 2022)

Expressivity of Neural Networks via Chaotic Itineraries beyond Sharkovsky's Theorem
Clayton Sanford, Vaggos Chatziafratis (19 Oct 2021)

Depth separation beyond radial functions
Luca Venturi, Samy Jelassi, Tristan Ozuch, Joan Bruna (02 Feb 2021)

On the statistical complexity of quantum circuits
Kaifeng Bu, D. E. Koh, Lu Li, Qingxian Luo, Yaobo Zhang (15 Jan 2021)

Benefits of depth in neural networks
Matus Telgarsky (14 Feb 2016)