Adaptive Blending Units: Trainable Activation Functions for Deep Neural Networks
arXiv:1806.10064 · 26 June 2018
L. R. Sütfeld, Flemming Brieger, Holger Finger, S. Füllhase, G. Pipa
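Adaptive Blending Units are, as the title indicates, trainable activation functions formed by blending a set of fixed base activations with learned weights. For orientation only, a minimal PyTorch sketch of that blending idea follows; the class name, the particular candidate activations, and the plain unnormalized weighting are assumptions of this sketch, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveBlendingUnit(nn.Module):
    """Trainable activation: a learned weighted blend of fixed base activations.

    Minimal sketch only; the candidate set and the unnormalized weighting
    are assumptions, not the paper's exact configuration.
    """

    def __init__(self, base_fns=None):
        super().__init__()
        # Candidate activations to blend (an assumed set, for illustration).
        self.base_fns = base_fns if base_fns is not None else [
            torch.tanh, torch.relu, F.elu, lambda x: x,
        ]
        # One trainable blending weight per candidate, initialized uniformly.
        self.alpha = nn.Parameter(
            torch.full((len(self.base_fns),), 1.0 / len(self.base_fns))
        )

    def forward(self, x):
        # Element-wise weighted sum of the candidate activations.
        return sum(w * f(x) for w, f in zip(self.alpha, self.base_fns))
```

Used in place of a fixed nonlinearity, e.g. nn.Sequential(nn.Linear(128, 128), AdaptiveBlendingUnit()), the blending weights are trained jointly with the rest of the network's parameters.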

Papers citing "Adaptive Blending Units: Trainable Activation Functions for Deep Neural Networks" (8 of 8 papers shown)
Semiring Activation in Neural Networks
B. Smets, Peter D. Donker, Jim W. Portegies · LLMSV · 29 May 2024

Nonlinearity Enhanced Adaptive Activation Functions
David Yevick · 29 Mar 2024

Learning Specialized Activation Functions for Physics-informed Neural Networks
Honghui Wang, Lu Lu, Shiji Song, Gao Huang · PINN, AI4CE · 08 Aug 2023

Bayesian optimization for sparse neural networks with trainable activation functions
M. Fakhfakh, Lotfi Chaari · 10 Apr 2023

How important are activation functions in regression and classification? A survey, performance comparison, and future directions
Ameya Dilip Jagtap, George Karniadakis · AI4CE · 06 Sep 2022

Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
S. Dubey, S. Singh, B. B. Chaudhuri · 29 Sep 2021

Parametric Flatten-T Swish: An Adaptive Non-linear Activation Function For Deep Learning
Hock Hung Chieng, Noorhaniza Wahid, P. Ong · 06 Nov 2020

A survey on modern trainable activation functions
Andrea Apicella, Francesco Donnarumma, Francesco Isgrò, R. Prevete · 02 May 2020