How important are activation functions in regression and classification? A survey, performance comparison, and future directions
Ameya Dilip Jagtap, George Karniadakis
AI4CE
6 September 2022
arXiv:2209.02681

Papers citing "How important are activation functions in regression and classification? A survey, performance comparison, and future directions"

27 / 27 papers shown

Anant-Net: Breaking the Curse of Dimensionality with Scalable and Interpretable Neural Surrogate for High-Dimensional PDEs
Sidharth S. Menon, Ameya D. Jagtap
PINN
06 May 2025

Do We Always Need the Simplicity Bias? Looking for Optimal Inductive Biases in the Wild
Damien Teney, Liangze Jiang, Florin Gogianu, Ehsan Abbasnejad
13 Mar 2025

Kolmogorov-Arnold Networks in Low-Data Regimes: A Comparative Study with Multilayer Perceptrons
Farhad Pourkamali-Anaraki
16 Sep 2024

Back to the Continuous Attractor
Ábel Ságodi, Guillermo Martín-Sánchez, Piotr Sokól, Il Memming Park
31 Jul 2024

Can all variations within the unified mask-based beamformer framework achieve identical peak extraction performance?
Atsuo Hiroe, Katsutoshi Itoyama, Kazuhiro Nakadai
22 Jul 2024

fKAN: Fractional Kolmogorov-Arnold Networks with trainable Jacobi basis functions
Alireza Afzal Aghaei
11 Jun 2024

Nonlinearity Enhanced Adaptive Activation Functions
David Yevick
29 Mar 2024

Probabilistic Neural Networks (PNNs) for Modeling Aleatoric Uncertainty in Scientific Machine Learning
Farhad Pourkamali-Anaraki, Jamal F. Husseini, Scott E. Stapleton
UD
21 Feb 2024

Kolmogorov n-Widths for Multitask Physics-Informed Machine Learning (PIML) Methods: Towards Robust Metrics
Michael Penwarden, H. Owhadi, Robert M. Kirby
AI4CE
16 Feb 2024

Adaptive Activation Functions for Predictive Modeling with Sparse Experimental Data
Farhad Pourkamali-Anaraki, Tahamina Nasrin, Robert E. Jensen, Amy M. Peterson, Christopher J. Hansen
08 Feb 2024

RiemannONets: Interpretable Neural Operators for Riemann Problems
Ahmad Peyvan, Vivek Oommen, Ameya Dilip Jagtap, George Karniadakis
AI4CE
16 Jan 2024

Data-driven localized waves and parameter discovery in the massive Thirring model via extended physics-informed neural networks with interface zones
Christian Berger, Sadok Ben Toumia, Zijian Zhou, Zhenya Yan
PINN
29 Sep 2023

Deep smoothness WENO scheme for two-dimensional hyperbolic conservation laws: A deep learning approach for learning smoothness indicators
Tatiana Kossaczká, Ameya Dilip Jagtap, Matthias Ehrhardt
18 Sep 2023

Physics-informed neural networks for predicting gas flow dynamics and unknown parameters in diesel engines
Kamaljyoti Nath, Xuhui Meng, Daniel J. Smith, George Karniadakis
PINN
26 Apr 2023

iPINNs: Incremental learning for Physics-informed neural networks
Aleksandr Dekhovich, M. Sluiter, David Tax, Miguel A. Bessa
AI4CE, DiffM
10 Apr 2023

Learning solution of nonlinear constitutive material models using physics-informed neural networks: COMM-PINN
Shahed Rezaei, Ahmad Moeineddin, Ali Harandi
PINN
10 Apr 2023

A unified scalable framework for causal sweeping strategies for Physics-Informed Neural Networks (PINNs) and their temporal decompositions
Michael Penwarden, Ameya Dilip Jagtap, Shandian Zhe, George Karniadakis, Robert M. Kirby
PINN, AI4CE
28 Feb 2023

Learning stiff chemical kinetics using extended deep neural operators
S. Goswami, Ameya Dilip Jagtap, H. Babaee, Bryan T. Susi, George Karniadakis
AI4CE
23 Feb 2023

Mixed formulation of physics-informed neural networks for thermo-mechanically coupled systems and heterogeneous domains
Ali Harandi, Ahmad Moeineddin, Michael Kaliske, Stefanie Reese, Shahed Rezaei
AI4CE, PINN
09 Feb 2023

Self-Supervised Learning for Data Scarcity in a Fatigue Damage Prognostic Problem
A. Akrim, C. Gogu, R. Vingerhoeds, M. Salaün
AI4CE
20 Jan 2023

Physics-informed Neural Networks with Periodic Activation Functions for Solute Transport in Heterogeneous Porous Media
Salah A. Faroughi, Ramin Soltanmohammad, Pingki Datta, S. K. Mahjour, S. Faroughi
17 Dec 2022

Modular machine learning-based elastoplasticity: generalization in the context of limited data
J. Fuhg, Craig M. Hamel, K. Johnson, Reese E. Jones, N. Bouklas
15 Oct 2022

A Lightweight and Gradient-Stable Neural Layer
Yueyao Yu, Yin Zhang
08 Jun 2021

Parallel Physics-Informed Neural Networks via Domain Decomposition
K. Shukla, Ameya Dilip Jagtap, George Karniadakis
PINN
20 Apr 2021

Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks
Tomasz Szandała
15 Oct 2020

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
3DH
17 Apr 2017

Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
MQ
10 Feb 2017