Scalable Interconnect Learning in Boolean Networks
Fabian Kresse
Emily Yu
Christoph H. Lampert

Main: 8 pages · Appendix: 2 pages · Bibliography: 2 pages · 11 figures · 2 tables
Abstract
Learned Differentiable Boolean Logic Networks (DBNs) already deliver efficient inference on resource-constrained hardware. We extend them with a trainable, differentiable interconnect whose parameter count remains constant as input width grows, allowing DBNs to scale to far wider layers than earlier learnable-interconnect designs while preserving their accuracy advantage. To further reduce model size, we propose two complementary pruning stages: a SAT-based logic-equivalence pass that removes redundant gates without affecting the network's output, and a similarity-based, data-driven pass that outperforms a magnitude-style greedy baseline and offers a superior compression-accuracy trade-off.
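As background for the abstract's setting, differentiable Boolean logic networks are commonly trained by relaxing each gate into a softmax-weighted mixture over the 16 two-input Boolean functions, evaluated on real-valued activations in [0, 1]; at inference time the argmax function is kept as a hard gate. The sketch below illustrates this standard relaxation in NumPy. It is a minimal illustration of the general technique, not the paper's implementation: the function name `soft_gate` and the exact parameterization are assumptions.

```python
import numpy as np

def soft_gate(a: np.ndarray, b: np.ndarray, logits: np.ndarray) -> np.ndarray:
    """Differentiable relaxation of one two-input Boolean gate.

    `a`, `b` are activation probabilities in [0, 1]; `logits` (shape (16,))
    parameterize a softmax over the 16 two-input Boolean functions.
    (Illustrative sketch; the paper's parameterization may differ.)
    """
    ab = a * b
    # Real-valued relaxations of all 16 two-input Boolean functions.
    ops = np.stack([
        np.zeros_like(a),      # 0:  FALSE
        ab,                    # 1:  a AND b
        a - ab,                # 2:  a AND NOT b
        a,                     # 3:  a
        b - ab,                # 4:  NOT a AND b
        b,                     # 5:  b
        a + b - 2 * ab,        # 6:  a XOR b
        a + b - ab,            # 7:  a OR b
        1 - (a + b - ab),      # 8:  a NOR b
        1 - (a + b - 2 * ab),  # 9:  a XNOR b
        1 - b,                 # 10: NOT b
        1 - b + ab,            # 11: a OR NOT b
        1 - a,                 # 12: NOT a
        1 - a + ab,            # 13: NOT a OR b
        1 - ab,                # 14: a NAND b
        np.ones_like(a),       # 15: TRUE
    ])
    # Softmax over the gate-choice logits, then a weighted sum of all ops.
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return np.tensordot(w, ops, axes=1)

# With logits strongly favoring index 1 (AND), the soft gate behaves
# almost exactly like a hard AND gate on Boolean inputs.
logits = np.full(16, -10.0)
logits[1] = 10.0
```

Because every op is a polynomial in `a` and `b`, the whole mixture is differentiable in both the activations and the logits, which is what makes gradient-based training of the gate choices possible.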
@article{kresse2025_2507.02585,
  title   = {Scalable Interconnect Learning in Boolean Networks},
  author  = {Fabian Kresse and Emily Yu and Christoph H. Lampert},
  journal = {arXiv preprint arXiv:2507.02585},
  year    = {2025}
}