Do Neural Networks Lose Plasticity in a Gradually Changing World?

Tianhui Liu
Lili Mou
Main: 8 pages
Figures: 6
Bibliography: 4 pages
Appendix: 3 pages
Abstract

Continual learning has become a trending topic in machine learning. Recent studies have discovered an interesting phenomenon called loss of plasticity, referring to neural networks gradually losing the ability to learn new tasks. However, existing plasticity research largely relies on contrived settings with abrupt task transitions, which often do not reflect real-world environments. In this paper, we investigate a gradually changing environment, which we simulate via input/output interpolation and task sampling. We perform theoretical and empirical analysis, showing that the loss of plasticity is an artifact of abrupt task changes in the environment and can be largely mitigated if the world changes gradually.
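The abstract does not spell out the simulation details, but a gradually changing environment built from input/output interpolation between two tasks might look like the following minimal NumPy sketch. The linear toy tasks, the mixing schedule, and all names here are assumptions for illustration, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy tasks: each maps the same input space through
# its own fixed random linear projection.
X_a = rng.normal(size=(100, 8))
X_b = rng.normal(size=(100, 8))
W_a = rng.normal(size=(8, 1))
W_b = rng.normal(size=(8, 1))

def interpolated_batch(alpha):
    """Blend the inputs and targets of task A and task B with weight alpha.

    alpha = 0 gives pure task A, alpha = 1 gives pure task B,
    and intermediate values give a mixture of the two.
    """
    X = (1 - alpha) * X_a + alpha * X_b
    y = (1 - alpha) * (X_a @ W_a) + alpha * (X_b @ W_b)
    return X, y

# A gradually changing world: alpha moves from 0 to 1 in small steps.
# An abrupt task change would instead jump from alpha = 0 straight to 1.
for alpha in np.linspace(0.0, 1.0, 11):
    X, y = interpolated_batch(alpha)
    # ... train the network on (X, y) for a few steps ...
```

Under this kind of schedule, consecutive training distributions differ only slightly, which is the contrast the paper draws with the abrupt-transition settings used in prior plasticity work.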
