APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning


10 September 2022
Ravin Kumar
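For context, the indexed paper proposes the APTx activation, defined as (alpha + tanh(beta * x)) * gamma * x. Below is a minimal sketch of that formula in Python; the default parameter values (alpha=1, beta=1, gamma=0.5) and the side-by-side MISH comparison are illustrative assumptions, not the paper's definitive settings or code.

```python
import numpy as np

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    """APTx activation: (alpha + tanh(beta * x)) * gamma * x.

    The defaults (alpha=1, beta=1, gamma=0.5) are assumed values chosen
    to give MISH-like behaviour; treat them as illustrative only.
    """
    return (alpha + np.tanh(beta * x)) * gamma * x

def mish(x):
    """MISH activation for comparison: x * tanh(softplus(x))."""
    return x * np.tanh(np.log1p(np.exp(x)))

if __name__ == "__main__":
    xs = np.linspace(-4.0, 4.0, 9)
    print("x       APTx     MISH")
    for x, a, m in zip(xs, aptx(xs), mish(xs)):
        print(f"{x:+.2f}  {a:+.4f}  {m:+.4f}")
```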

Papers citing "APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning"

3 of 3 citing papers shown.

  1. Improving Classification Neural Networks by using Absolute activation function (MNIST/LeNET-5 example)
     Oleg I. Berngardt, 23 Apr 2023

  2. Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks
     Tomasz Szandała, 15 Oct 2020

  3. Deeply learned face representations are sparse, selective, and robust
     Yi Sun, Xiaogang Wang, Xiaoou Tang, 03 Dec 2014