On the Activation Function Dependence of the Spectral Bias of Neural Networks

9 August 2022
Q. Hong, Jonathan W. Siegel, Qinyan Tan, Jinchao Xu

Papers citing "On the Activation Function Dependence of the Spectral Bias of Neural Networks"

13 papers shown.

Data-Driven Probabilistic Air-Sea Flux Parameterization
Jiarong Wu, Pavel Perezhogin, D. Gagne, Brandon Reichl, Aneesh C. Subramanian, Elizabeth Thompson, Laure Zanna
06 Mar 2025

Orthogonal greedy algorithm for linear operator learning with shallow neural network
Ye Lin, Jiwei Jia, Young Ju Lee, Ran Zhang
06 Jan 2025

On the expressiveness and spectral bias of KANs
Yixuan Wang, Jonathan W. Siegel, Ziming Liu, Thomas Y. Hou
02 Oct 2024

Deep Learning without Global Optimization by Random Fourier Neural Networks (BDL)
Owen Davis, Gianluca Geraci, Mohammad Motamed
16 Jul 2024

Understanding the dynamics of the frequency bias in neural networks
Juan Molina, Mircea Petrache, F. Sahli Costabal, Matías Courdurier
23 May 2024

Parametric Encoding with Attention and Convolution Mitigate Spectral Bias of Neural Partial Differential Equation Solvers (AI4CE)
Mehdi Shishehbor, Shirin Hosseinmardi, Ramin Bostanabad
22 Mar 2024

Neural Redshift: Random Networks are not Random Functions
Damien Teney, A. Nicolicioiu, Valentin Hartmann, Ehsan Abbasnejad
04 Mar 2024

Comparing Spectral Bias and Robustness For Two-Layer Neural Networks: SGD vs Adaptive Random Fourier Features
Aku Kammonen, Lisi Liang, Anamika Pandey, Raúl Tempone
01 Feb 2024

Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks
Yahong Yang, Qipin Chen, Wenrui Hao
26 Sep 2023

Why Shallow Networks Struggle with Approximating and Learning High Frequency: A Numerical Study
Shijun Zhang, Hongkai Zhao, Yimin Zhong, Haomin Zhou
29 Jun 2023

Reliable extrapolation of deep neural operators informed by physics or sparse observations
Min Zhu, Handi Zhang, Anran Jiao, George Karniadakis, Lu Lu
13 Dec 2022

A Lightweight and Gradient-Stable Neural Layer
Yueyao Yu, Yin Zhang
08 Jun 2021

Frequency Principle in Deep Learning Beyond Gradient-descent-based Training
Yuheng Ma, Zhi-Qin John Xu, Jiwei Zhang
04 Jan 2021