Emergent representations in networks trained with the Forward-Forward algorithm

26 May 2023
Niccolò Tosato
Lorenzo Basile
Emanuele Ballarin
Giuseppe de Alteriis
Alberto Cazzaniga
Alessio Ansuini
Abstract

The Backpropagation algorithm has often been criticised for its lack of biological realism. In an attempt to find a more biologically plausible alternative, the recently introduced Forward-Forward algorithm replaces the forward and backward passes of Backpropagation with two forward passes. In this work, we show that the internal representations obtained by the Forward-Forward algorithm can organise into category-specific ensembles exhibiting high sparsity, i.e., composed of a small number of active units. This situation is reminiscent of what has been observed in cortical sensory areas, where neuronal ensembles are suggested to serve as the functional building blocks for perception and action. Interestingly, while this sparse pattern does not typically arise in models trained with standard Backpropagation, it can emerge in networks trained with Backpropagation on the same objective proposed for the Forward-Forward algorithm.
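
For context, below is a minimal sketch of a single layer-local Forward-Forward update, assuming the goodness objective of Hinton's original proposal (goodness = sum of squared activations, pushed above a threshold for positive data and below it for negative data). The class name, hyperparameters, and structure are illustrative, not taken from this paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    """One layer trained with a local Forward-Forward objective (sketch)."""
    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.threshold = threshold  # illustrative value, not from the paper
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalise the input so only its direction carries information to this layer.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return F.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness: sum of squared activities per sample.
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        # Logistic-style loss: positive goodness above threshold, negative below it.
        loss = F.softplus(torch.cat([self.threshold - g_pos,
                                     g_neg - self.threshold])).mean()
        self.opt.zero_grad()
        loss.backward()  # gradients stay within this layer
        self.opt.step()
        # Detach outputs so no error signal propagates backwards between layers.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

Stacking such layers and feeding the detached outputs of each layer to the next gives the layer-wise training scheme: two forward passes (positive and negative data) and no backward pass across layers.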

View on arXiv
@article{tosato2025_2305.18353,
  title={Emergent representations in networks trained with the Forward-Forward algorithm},
  author={Niccolò Tosato and Lorenzo Basile and Emanuele Ballarin and Giuseppe de Alteriis and Alberto Cazzaniga and Alessio Ansuini},
  journal={arXiv preprint arXiv:2305.18353},
  year={2025}
}