Any Deep ReLU Network is Shallow
arXiv:2306.11827
20 June 2023
M. Villani, Nandi Schoots
Communities: FAtt, OffRL

Papers citing "Any Deep ReLU Network is Shallow" (7 of 7 shown)

  1. Relating Piecewise Linear Kolmogorov Arnold Networks to ReLU Networks
     Nandi Schoots, M. Villani, Niels uit de Bos (03 Mar 2025)
  2. The structure of the token space for large language models
     Michael Robinson, Sourya Dey, Shauna Sweet (11 Oct 2024)
  3. RepAct: The Re-parameterizable Adaptive Activation Function
     Xian Wu, Qingchuan Tao, Shuang Wang (28 Jun 2024)
  4. The Topos of Transformer Networks
     Mattia Jacopo Villani, Peter McBurney (27 Mar 2024)
  5. Parallel Algorithms for Exact Enumeration of Deep Neural Network Activation Regions
     Sabrina Drammis, Bowen Zheng, Karthik Srinivasan, R. Berwick, Nancy A. Lynch, R. Ajemian (29 Feb 2024)
  6. Unwrapping All ReLU Networks
     M. Villani, Peter McBurney (16 May 2023)
  7. When Deep Learning Meets Polyhedral Theory: A Survey
     Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay (29 Apr 2023)