Memory capacity of neural networks with threshold and ReLU activations
Roman Vershynin
arXiv:2001.06938, 20 January 2020
Papers citing "Memory capacity of neural networks with threshold and ReLU activations" (18 of 18 papers shown):
Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks. Jy-yong Sohn, Dohyun Kwon, Seoyeon An, Kangwook Lee. 01 Aug 2024.
Fixed width treelike neural networks capacity analysis -- generic activations. M. Stojnic. 08 Feb 2024.
Lifted RDT based capacity analysis of the 1-hidden layer treelike sign perceptrons neural networks. M. Stojnic. 13 Dec 2023.
Capacity of the treelike sign perceptrons neural networks with one hidden layer -- RDT based upper bounds. M. Stojnic. 13 Dec 2023.
Memorization with neural nets: going beyond the worst case. S. Dirksen, Patrick Finke, Martin Genzel. 30 Sep 2023.
Memorization Capacity of Multi-Head Attention in Transformers. Sadegh Mahdavi, Renjie Liao, Christos Thrampoulidis. 03 Jun 2023.
Globally Optimal Training of Neural Networks with Threshold Activation Functions. Tolga Ergen, Halil Ibrahim Gulluk, Jonathan Lacotte, Mert Pilanci. 06 Mar 2023.
Task Discovery: Finding the Tasks that Neural Networks Generalize on. Andrei Atanov, Andrei Filatov, Teresa Yeo, Ajay Sohmshetty, Amir Zamir. 01 Dec 2022.
On the Optimal Memorization Power of ReLU Neural Networks. Gal Vardi, Gilad Yehudai, Ohad Shamir. 07 Oct 2021.
Statistically Meaningful Approximation: a Case Study on Approximating Turing Machines with Transformers. Colin Wei, Yining Chen, Tengyu Ma. 28 Jul 2021.
Provable Memorization via Deep Neural Networks using Sub-linear Parameters. Sejun Park, Jaeho Lee, Chulhee Yun, Jinwoo Shin. 26 Oct 2020.
Random Vector Functional Link Networks for Function Approximation on Manifolds. Deanna Needell, Aaron A. Nelson, Rayan Saab, Palina Salanevich, Olov Schavemaker. 30 Jul 2020.
The Interpolation Phase Transition in Neural Networks: Memorization and Generalization under Lazy Training. Andrea Montanari, Yiqiao Zhong. 25 Jul 2020.
Universal Approximation Power of Deep Residual Neural Networks via Nonlinear Control Theory. Paulo Tabuada, Bahman Gharesifard. 12 Jul 2020.
Minimum Width for Universal Approximation. Sejun Park, Chulhee Yun, Jaeho Lee, Jinwoo Shin. 16 Jun 2020.
Approximation in shift-invariant spaces with deep ReLU neural networks. Yunfei Yang, Zhen Li, Yang Wang. 25 May 2020.
Global Convergence of Deep Networks with One Wide Layer Followed by Pyramidal Topology. Quynh N. Nguyen, Marco Mondelli. 18 Feb 2020.
NEU: A Meta-Algorithm for Universal UAP-Invariant Feature Representation. Anastasis Kratsios, Cody B. Hyndman. 31 Aug 2018.