Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron

IEEE Annual Symposium on Foundations of Computer Science (FOCS), 2021
25 February 2021
Emmanuel Abbe, Shuangping Li, Allan Sly
arXiv:2102.13069

Papers citing "Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron"

12 papers shown

Binary perceptron computational gap -- a parametric fl RDT view
Mihailo Stojnic
02 Nov 2025

High-dimensional manifold of solutions in neural networks: insights from statistical physics
Enrico M. Malatesta
20 Feb 2025

Injectivity capacity of ReLU gates
Mihailo Stojnic
28 Oct 2024

Exact full-RSB SAT/UNSAT transition in infinitely wide two-layer neural networks
SciPost Physics (SciPost Phys.), 2024
B. Annesi, Enrico M. Malatesta, Francesco Zamponi
09 Oct 2024

Fl RDT based ultimate lowering of the negative spherical perceptron capacity
M. Stojnic
27 Dec 2023

Binary perceptrons capacity via fully lifted random duality theory
M. Stojnic
29 Nov 2023

Typical and atypical solutions in non-convex neural networks with discrete and continuous weights
Physical Review E (PRE), 2023
Carlo Baldassi, Enrico M. Malatesta, Gabriele Perugini, R. Zecchina
26 Apr 2023

Sharp analysis of EM for learning mixtures of pairwise differences
Annual Conference on Computational Learning Theory (COLT), 2023
A. Dhawan, Cheng Mao, A. Pananjady
20 Feb 2023

Disordered Systems Insights on Computational Hardness
Journal of Statistical Mechanics: Theory and Experiment (JSTAT), 2022
D. Gamarnik, Cristopher Moore, Lenka Zdeborová
15 Oct 2022

Equivalence between algorithmic instability and transition to replica symmetry breaking in perceptron learning systems
Physical Review Research (Phys. Rev. Res.), 2021
Yang Zhao, Junbin Qiu, Mingshan Xie, Haiping Huang
26 Nov 2021

Binary perceptron: efficient algorithms can find solutions in a rare well-connected cluster
Symposium on the Theory of Computing (STOC), 2021
Emmanuel Abbe, Shuangping Li, Allan Sly
04 Nov 2021

Learning through atypical "phase transitions" in overparameterized neural networks
Carlo Baldassi, Clarissa Lauditi, Enrico M. Malatesta, R. Pacelli, Gabriele Perugini, R. Zecchina
01 Oct 2021