
Mildly Overparametrized Neural Nets can Memorize Training Data Efficiently (arXiv:1909.11837)

26 September 2019
Rong Ge
Runzhe Wang
Haoyu Zhao
    TDI

Papers citing "Mildly Overparametrized Neural Nets can Memorize Training Data Efficiently" (5 / 5 papers shown):

  1. \emph{Lifted} RDT based capacity analysis of the 1-hidden layer treelike \emph{sign} perceptrons neural networks. M. Stojnic. 13 Dec 2023.
  2. Capacity of the treelike sign perceptrons neural networks with one hidden layer -- RDT based upper bounds. M. Stojnic. 13 Dec 2023.
  3. Learning Parities with Neural Networks. Amit Daniely, Eran Malach. 18 Feb 2020.
  4. Memory capacity of neural networks with threshold and ReLU activations. Roman Vershynin. 20 Jan 2020.
  5. Active Subspace of Neural Networks: Structural Analysis and Universal Attacks. Chunfeng Cui, Kaiqi Zhang, Talgat Daulbaev, Julia Gusak, Ivan V. Oseledets, Zheng-Wei Zhang. (AAML) 29 Oct 2019.