APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning
Ravin Kumar
10 September 2022
arXiv: 2209.06119
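For context, a minimal PyTorch sketch of the APTx activation, assuming the paper's definition APTx(x) = (alpha + tanh(beta * x)) * gamma * x with the stated defaults alpha = beta = 1 and gamma = 1/2, under which it approximates MISH with a single tanh evaluation. The class name and constructor below are illustrative choices, not the author's reference implementation.

```python
import torch
import torch.nn as nn

class APTx(nn.Module):
    """APTx activation: (alpha + tanh(beta * x)) * gamma * x.

    Defaults alpha=1, beta=1, gamma=0.5 are assumed from the cited paper;
    with these values APTx approximates MISH while needing only one tanh
    instead of MISH's tanh(softplus(x)).
    """
    def __init__(self, alpha: float = 1.0, beta: float = 1.0, gamma: float = 0.5):
        super().__init__()
        self.alpha = alpha
        self.beta = beta
        self.gamma = gamma

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (self.alpha + torch.tanh(self.beta * x)) * self.gamma * x

# Usage: a drop-in replacement wherever nn.ReLU() or nn.Mish() would appear.
act = APTx()
print(act(torch.linspace(-3.0, 3.0, 5)))
```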
Papers citing "APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning" (3 of 3 papers shown):
1. Improving Classification Neural Networks by using Absolute activation function (MNIST/LeNET-5 example). Oleg I. Berngardt. 23 Apr 2023.
2. Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks. Tomasz Szandała. 15 Oct 2020.
3. Deeply learned face representations are sparse, selective, and robust. Yi Sun, Xiaogang Wang, Xiaoou Tang. CVBM. 03 Dec 2014.