
Thermodynamic bounds on energy use in Deep Neural Networks

Main: 6 pages, 2 figures, 1 table
Abstract

While Landauer's principle sets a fundamental energy limit for irreversible digital computation, we show that Deep Neural Networks (DNNs) implemented on analog physical substrates can operate under markedly different thermodynamic constraints. We distinguish between two classes of analog systems: dynamic and quasi-static. In dynamic systems, energy dissipation arises from neuron resets, with a lower bound governed by Landauer's principle. To analyze a quasi-static analog platform, we construct an explicit mapping of a generic feedforward DNN onto a physical system described by a model Hamiltonian. In this framework, inference can proceed reversibly, with no minimum free energy cost imposed by thermodynamics. We further analyze the training process in quasi-static analog networks and derive a fundamental lower bound on its energy cost, rooted in the interplay between thermal and statistical noise. Our results suggest that while analog implementations can outperform digital ones during inference, the thermodynamic cost of training scales similarly in both paradigms.
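As a rough illustration of the dynamic-system bound mentioned in the abstract, the sketch below applies Landauer's limit of k_B T ln 2 of dissipated heat per erased bit to neuron resets. The network size, temperature, and one-bit-per-reset assumption are illustrative choices, not figures taken from the paper.

```python
from math import log

# Landauer's principle: erasing one bit at temperature T dissipates
# at least k_B * T * ln(2) of heat (Landauer, 1961).
K_B = 1.380649e-23  # Boltzmann constant, J/K


def landauer_bound_per_inference(n_neurons: int,
                                 bits_erased_per_reset: float = 1.0,
                                 temperature_k: float = 300.0) -> float:
    """Lower bound (in joules) on heat dissipated by one forward pass of a
    dynamic analog DNN, assuming each neuron undergoes one reset that erases
    `bits_erased_per_reset` bits of information (an illustrative assumption)."""
    return n_neurons * bits_erased_per_reset * K_B * temperature_k * log(2)


if __name__ == "__main__":
    # Illustrative figures only: a one-million-neuron network at room temperature.
    bound = landauer_bound_per_inference(n_neurons=1_000_000)
    print(f"Landauer lower bound per inference: {bound:.3e} J")  # ~2.9e-15 J
```

In a quasi-static analog implementation, as the abstract argues, inference can in principle proceed reversibly and this per-reset cost does not apply; the reset-based accounting above pertains only to the dynamic class of systems.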
