Decomposing neural networks as mappings of correlation functions
arXiv: 2202.04925

Physical Review Research (Phys. Rev. Res.), 2022
10 February 2022
Kirsten Fischer, Alexandre René, Christian Keup, Moritz Layer, David Dahmen, M. Helias

Papers citing "Decomposing neural networks as mappings of correlation functions"

8 / 8 papers shown

Symmetry and Generalisation in Neural Approximations of Renormalisation Transformations
Cassidy Ashworth, Pietro Lio, Francesco Caso
18 Oct 2025

Probing Geometry of Next Token Prediction Using Cumulant Expansion of the Softmax Entropy
Karthik Viswanathan, Sang Eon Park
05 Oct 2025

Flat Channels to Infinity in Neural Loss Landscapes
Flavio Martinelli, Alexander Van Meegen, Berfin Simsek, W. Gerstner, Johanni Brea
17 Jun 2025

Neural Network Field Theories: Non-Gaussianity, Actions, and Locality
M. Demirtaş, James Halverson, Anindita Maiti, M. Schwartz, Keegan Stoner
06 Jul 2023

Bayes-optimal Learning of Deep Random Networks of Extensive-width
International Conference on Machine Learning (ICML), 2023
Hugo Cui, Florent Krzakala, Lenka Zdeborová
01 Feb 2023

Neural networks trained with SGD learn distributions of increasing complexity
International Conference on Machine Learning (ICML), 2022
Maria Refinetti, Alessandro Ingrosso, Sebastian Goldt
21 Nov 2022

Origami in N dimensions: How feed-forward networks manufacture linear separability
Christian Keup, M. Helias
21 Mar 2022

Separation of Scales and a Thermodynamic Description of Feature Learning in Some CNNs
Nature Communications (Nat Commun), 2021
Inbar Seroussi, Gadi Naveh, Zohar Ringel
31 Dec 2021