
Oja's plasticity rule overcomes several challenges of training neural networks under biological constraints

Abstract

Deep neural networks have achieved impressive performance through carefully engineered training strategies. Nonetheless, such methods lack parallels in biological neural circuits, relying heavily on non-local credit assignment, precise initialization, normalization layers, batch processing, and large datasets. Biologically plausible plasticity rules, such as random feedback alignment, often suffer from instability and unbounded weight growth without these engineered methods, while Hebbian-type schemes fail to provide goal-directed credit assignment. In this study, we demonstrate that incorporating Oja's plasticity rule into error-driven training yields stable, efficient learning in feedforward and recurrent architectures, obviating the need for such engineered workarounds. Our results show that Oja's rule preserves richer activation subspaces, mitigates exploding or vanishing signals, and improves short-term memory in recurrent networks. Notably, meta-learned local plasticity rules incorporating Oja's principle not only match but surpass standard backpropagation in data-scarce regimes. These findings reveal a biologically grounded pathway bridging engineered deep networks and plausible synaptic mechanisms.
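To make the core mechanism concrete, below is a minimal NumPy sketch (not the authors' code) of a single linear layer updated with a convex mix of Oja's rule and a generic error-driven term. The mixing coefficient beta and the placeholder error signal delta are illustrative assumptions; what Oja's rule contributes is the decay term y^2 W, which bounds the weight norm locally without an explicit normalization layer.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_out = 64, 16
    W = rng.normal(scale=0.1, size=(n_out, n_in))
    eta, beta = 1e-2, 0.5   # learning rate; Oja/error mixing weight (assumed)

    for step in range(1000):
        x = rng.normal(size=n_in)   # presynaptic activity
        y = W @ x                   # postsynaptic activity (linear units)
        delta = -y                  # placeholder error signal (assumption)

        # Oja's rule per output unit: Hebbian term y_i * x_j minus the
        # decay y_i**2 * W_ij, which keeps each weight row's norm bounded.
        dW_oja = np.outer(y, x) - (y ** 2)[:, None] * W

        # Generic error-driven term; in the paper's setting delta would
        # come from backpropagation or random feedback alignment.
        dW_err = np.outer(delta, x)

        W += eta * (beta * dW_oja + (1 - beta) * dW_err)

In this sketch the Oja term acts as a local, activity-dependent weight normalization, which is the stabilizing role the abstract attributes to it.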

@article{shervani-tabar2025_2408.08408,
  title={Oja's plasticity rule overcomes several challenges of training neural networks under biological constraints},
  author={Navid Shervani-Tabar and Marzieh Alireza Mirhoseini and Robert Rosenbaum},
  journal={arXiv preprint arXiv:2408.08408},
  year={2025}
}