arXiv:1907.10599
A Fine-Grained Spectral Perspective on Neural Networks
24 July 2019
Greg Yang, Hadi Salman
Papers citing "A Fine-Grained Spectral Perspective on Neural Networks"

34 / 34 papers shown
Hamiltonian Neural Networks for Robust Out-of-Time Credit Scoring
Javier Marín · 13 Mar 2025

SHAP values via sparse Fourier representation
Ali Gorji, Andisheh Amrollahi, A. Krause · 08 Oct 2024 · FAtt

Tuning Frequency Bias of State Space Models
Annan Yu, Dongwei Lyu, S. H. Lim, Michael W. Mahoney, N. Benjamin Erichson · 02 Oct 2024

Neural Redshift: Random Networks are not Random Functions
Damien Teney, A. Nicolicioiu, Valentin Hartmann, Ehsan Abbasnejad · 04 Mar 2024

Simplicity bias, algorithmic probability, and the random logistic map
B. Hamzi, K. Dingle · 31 Dec 2023

In Search of a Data Transformation That Accelerates Neural Field Training
Junwon Seo, Sangyoon Lee, Kwang In Kim, Jaeho Lee · 28 Nov 2023

Double Pessimism is Provably Efficient for Distributionally Robust Offline Reinforcement Learning: Generic Algorithm and Robust Partial Coverage
Jose H. Blanchet, Miao Lu, Tong Zhang, Han Zhong · 16 May 2023 · OffRL

Do deep neural networks have an inbuilt Occam's razor?
Chris Mingard, Henry Rees, Guillermo Valle Pérez, A. Louis · 13 Apr 2023 · UQCV, BDL

On the Stepwise Nature of Self-Supervised Learning
James B. Simon, Maksis Knutins, Liu Ziyin, Daniel Geisz, Abraham J. Fetterman, Joshua Albrecht · 27 Mar 2023 · SSL

The SSL Interplay: Augmentations, Inductive Bias, and Generalization
Vivien A. Cabannes, B. Kiani, Randall Balestriero, Yann LeCun, A. Bietti · 06 Feb 2023 · SSL

Width and Depth Limits Commute in Residual Networks
Soufiane Hayou, Greg Yang · 01 Feb 2023

Characterizing the Spectrum of the NTK via a Power Series Expansion
Michael Murray, Hui Jin, Benjamin Bowman, Guido Montúfar · 15 Nov 2022

Spectral Regularization Allows Data-frugal Learning over Combinatorial Spaces
Amirali Aghazadeh, Nived Rajaraman, Tony Tu, Kannan Ramchandran · 05 Oct 2022

Lazy vs hasty: linearization in deep networks impacts learning schedule based on example difficulty
Thomas George, Guillaume Lajoie, A. Baratin · 19 Sep 2022

On the Activation Function Dependence of the Spectral Bias of Neural Networks
Q. Hong, Jonathan W. Siegel, Qinyan Tan, Jinchao Xu · 09 Aug 2022

Overcoming the Spectral Bias of Neural Value Approximation
Ge Yang, Anurag Ajay, Pulkit Agrawal · 09 Jun 2022

The Spectral Bias of Polynomial Neural Networks
Moulik Choraria, L. Dadi, Grigorios G. Chrysos, Julien Mairal, V. Cevher · 27 Feb 2022

Overview frequency principle/spectral bias in deep learning
Z. Xu, Yaoyu Zhang, Tao Luo · 19 Jan 2022 · FaML

Understanding Layer-wise Contributions in Deep Neural Networks through Spectral Analysis
Yatin Dandi, Arthur Jacot · 06 Nov 2021 · FAtt

Fishr: Invariant Gradient Variances for Out-of-Distribution Generalization
Alexandre Ramé, Corentin Dancette, Matthieu Cord · 07 Sep 2021 · OOD

A Neural Tangent Kernel Perspective of GANs
Jean-Yves Franceschi, Emmanuel de Bézenac, Ibrahim Ayed, Mickaël Chen, Sylvain Lamprier, Patrick Gallinari · 10 Jun 2021

Frequency Principle in Deep Learning Beyond Gradient-descent-based Training
Yuheng Ma, Zhi-Qin John Xu, Jiwei Zhang · 04 Jan 2021

Gradient Starvation: A Learning Proclivity in Neural Networks
Mohammad Pezeshki, Sekouba Kaba, Yoshua Bengio, Aaron Courville, Doina Precup, Guillaume Lajoie · 18 Nov 2020 · MLT

Stable ResNet
Soufiane Hayou, Eugenio Clerico, Bo He, George Deligiannidis, Arnaud Doucet, Judith Rousseau · 24 Oct 2020 · ODL, SSeg

Deep Equals Shallow for ReLU Networks in Kernel Regimes
A. Bietti, Francis R. Bach · 30 Sep 2020

Deep Neural Tangent Kernel and Laplace Kernel Have the Same RKHS
Lin Chen, Sheng Xu · 22 Sep 2020

Tensor Programs II: Neural Tangent Kernel for Any Architecture
Greg Yang · 25 Jun 2020

Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains
Matthew Tancik, Pratul P. Srinivasan, B. Mildenhall, Sara Fridovich-Keil, N. Raghavan, Utkarsh Singhal, R. Ramamoorthi, Jonathan T. Barron, Ren Ng · 18 Jun 2020

Spectra of the Conjugate Kernel and Neural Tangent Kernel for linear-width neural networks
Z. Fan, Zhichao Wang · 25 May 2020

Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks
Blake Bordelon, Abdulkadir Canatar, Cengiz Pehlevan · 07 Feb 2020

Towards Understanding the Spectral Bias of Deep Learning
Yuan Cao, Zhiying Fang, Yue Wu, Ding-Xuan Zhou, Quanquan Gu · 03 Dec 2019

Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes
Greg Yang · 28 Oct 2019

Neural networks are a priori biased towards Boolean functions with low entropy
Chris Mingard, Joar Skalse, Guillermo Valle Pérez, David Martínez-Rubio, Vladimir Mikulik, A. Louis · 25 Sep 2019 · FAtt, AI4CE

Adversarial Examples, Uncertainty, and Transfer Testing Robustness in Gaussian Process Hybrid Deep Networks
John Bradshaw, A. G. Matthews, Zoubin Ghahramani · 08 Jul 2017 · BDL, AAML