A comparative study of back propagation and its alternatives on multilayer perceptrons

31 May 2022
John F. Waldo
arXiv:2206.06098
Abstract

Backpropagation (BP) is the de facto algorithm for the backward pass when training a feedforward neural network. The use of almost-everywhere differentiable activation functions makes it efficient and effective to propagate gradients backwards through the layers of deep neural networks. In recent years, however, there has been considerable research into alternatives to backpropagation. This work has largely focused on reaching state-of-the-art accuracy on multilayer perceptrons (MLPs) and convolutional neural networks (CNNs). In this paper, we analyze the stability and similarity of predictions and neurons in MLPs and propose a new variation of one of these algorithms.
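To make the mechanism concrete, here is a minimal NumPy sketch of the backward pass the abstract describes: the chain rule applied layer by layer through an MLP. The two-layer architecture, ReLU activation, squared-error loss, and learning rate are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of backpropagation through a two-layer MLP.
# Architecture, activation, loss, and hyperparameters are illustrative
# assumptions, not details from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8 samples, 4 input features, 1 output.
X = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 1))

# Parameters of a 4 -> 5 -> 1 MLP.
W1 = rng.normal(scale=0.5, size=(4, 5))
b1 = np.zeros(5)
W2 = rng.normal(scale=0.5, size=(5, 1))
b2 = np.zeros(1)

lr = 0.1
for step in range(100):
    # Forward pass; ReLU is differentiable almost everywhere,
    # which is what makes the backward pass below well defined.
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0.0)          # ReLU
    y_hat = a1 @ W2 + b2
    loss = 0.5 * np.mean((y_hat - y) ** 2)

    # Backward pass: apply the chain rule layer by layer,
    # propagating the gradient from the output back toward the input.
    n = X.shape[0]
    d_yhat = (y_hat - y) / n          # dL/d(y_hat)
    dW2 = a1.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_a1 = d_yhat @ W2.T
    d_z1 = d_a1 * (z1 > 0.0)          # almost-everywhere ReLU derivative
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
```

The `(z1 > 0.0)` mask is the almost-everywhere derivative the abstract refers to: ReLU is differentiable except at zero, where a subgradient of 0 is used, so the gradient can still be propagated through every layer.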
