Symmetrical Gaussian Error Linear Units (SGELUs)

10 November 2019
Chao Yu, Zhiguo Su
arXiv:1911.03925
Abstract

In this paper, a novel neural network activation function, called the Symmetrical Gaussian Error Linear Unit (SGELU), is proposed to obtain high performance. It is constructed by integrating the stochastic-regularizer property of the Gaussian Error Linear Unit (GELU) with a symmetrical characteristic. By combining these two merits, the proposed unit enables bidirectional convergence, allowing the network to be optimized without the vanishing-gradient problem. SGELU is evaluated against GELU and the Linearly Scaled Hyperbolic Tangent (LiSHT) on MNIST classification and an MNIST auto-encoder, and the results validate its performance and convergence rate across these applications.
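
For concreteness, the sketch below contrasts GELU with a symmetric GELU-like unit. The abstract does not give SGELU's closed form, so the definition sgelu(x) = alpha * x * erf(x / sqrt(2)) and the alpha scaling parameter used here are assumptions made only for illustration, not the authors' exact formulation.

```python
# Minimal sketch comparing GELU with a symmetric GELU-like variant.
# NOTE: the SGELU closed form below is an assumption for illustration;
# the abstract does not state the authors' exact definition.
import numpy as np
from scipy.special import erf

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return 0.5 * x * (1.0 + erf(x / np.sqrt(2.0)))

def sgelu(x, alpha=1.0):
    # Assumed symmetric variant: an even function, sgelu(-x) == sgelu(x),
    # so negative inputs receive gradients that mirror positive ones.
    return alpha * x * erf(x / np.sqrt(2.0))

if __name__ == "__main__":
    xs = np.linspace(-3.0, 3.0, 7)
    print("gelu :", np.round(gelu(xs), 4))
    print("sgelu:", np.round(sgelu(xs), 4))
    # Symmetry check: the assumed SGELU is mirrored about the y-axis,
    # while GELU is not.
    assert np.allclose(sgelu(-xs), sgelu(xs))
```

Under this assumed form, the unit grows roughly linearly in both directions away from the origin, which is one way to read the abstract's claim of "bidirectional convergence" without diminishing gradients.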
