ResearchTrend.AI
A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
arXiv:2201.08652, 21 January 2022
Xiaoyu Ma, S. Sardy, N. Hengartner, Nikolai Bobenko, Yen Ting Lin

Papers citing "A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks"

6 of 6 papers shown
Training a neural network for data reduction and better generalization
S. Sardy, Maxime van Cutsem, Xiaoyu Ma (26 Nov 2024)

Statistical Guarantees for Approximate Stationary Points of Simple Neural Networks
Mahsa Taheri, Fang Xie, Johannes Lederer (09 May 2022)

Consistent Sparse Deep Learning: Theory and Computation
Y. Sun, Qifan Song, F. Liang (25 Feb 2021)

Truly Sparse Neural Networks at Scale
Selima Curci, D. Mocanu, Mykola Pechenizkiy (02 Feb 2021)

Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data
Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga (11 Dec 2020)

Improving neural networks by preventing co-adaptation of feature detectors
Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov (03 Jul 2012)