Training deep physical neural networks with local physical information bottleneck

Hao Wang
Ziao Wang
Xiangpeng Liang
Han Zhao
Jianqi Hu
Junjie Jiang
Xing Fu
Jianshi Tang
Huaqiang Wu
Sylvain Gigan
Qiang Liu
Main: 9 pages, 4 figures, 1 table
Abstract

Deep learning has revolutionized modern society but faces growing energy and latency constraints. Deep physical neural networks (PNNs) are interconnected computing systems that directly exploit analog dynamics for energy-efficient, ultrafast AI execution. Realizing this potential, however, requires universal training methods tailored to physical intricacies. Here, we present the Physical Information Bottleneck (PIB), a general and efficient framework that integrates information theory and local learning, enabling deep PNNs to learn under arbitrary physical dynamics. By allocating matrix-based information bottlenecks to each unit, we demonstrate supervised, unsupervised, and reinforcement learning across electronic memristive chips and optical computing platforms. PIB also adapts to severe hardware faults and allows for parallel training via geographically distributed resources. Bypassing auxiliary digital models and contrastive measurements, PIB recasts PNN training as an intrinsic, scalable information-theoretic process compatible with diverse physical substrates.
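The abstract does not specify how the "matrix-based information bottleneck" is computed, but matrix-based formulations of the information bottleneck commonly estimate entropies from normalized Gram matrices of layer activations (matrix-based Rényi entropy), which permits a layer-local loss without backpropagating through the physical system. The sketch below illustrates that general idea under stated assumptions: a Gaussian kernel, Rényi order 2, and the function names (`gram`, `local_ib_loss`, etc.) are all illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def gram(X, sigma=1.0):
    # Gaussian Gram matrix over samples (rows of X), normalized to unit trace.
    # Assumption: a Gaussian kernel; the paper may use a different kernel.
    sq = np.sum(X ** 2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-D / (2.0 * sigma ** 2))
    return K / np.trace(K)

def renyi2_entropy(A):
    # Matrix-based Rényi entropy of order 2: S_2(A) = -log2 tr(A^2).
    # Non-negative for a PSD matrix with unit trace.
    return -np.log2(np.trace(A @ A))

def joint_gram(A, B):
    # Joint Gram matrix via the Hadamard product, renormalized to unit trace.
    H = A * B
    return H / np.trace(H)

def mutual_info(A, B):
    # Matrix-based mutual information: I(A;B) = S(A) + S(B) - S(A,B).
    return renyi2_entropy(A) + renyi2_entropy(B) - renyi2_entropy(joint_gram(A, B))

def local_ib_loss(X, T, Y, beta=4.0, sigma=1.0):
    # Layer-local information bottleneck objective for one physical unit:
    # compress the input representation, I(X;T), while preserving label
    # information, I(T;Y). X: unit input, T: unit output, Y: targets.
    # beta trades compression against relevance (illustrative value).
    Ax, At, Ay = gram(X, sigma), gram(T, sigma), gram(Y, sigma)
    return mutual_info(Ax, At) - beta * mutual_info(At, Ay)
```

Because the loss for each unit depends only on that unit's own input, output, and the targets, each physical layer could in principle be trained in isolation, which is consistent with the abstract's claims of local learning and parallel training across distributed hardware.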
