
Pinet: Optimizing hard-constrained neural networks with orthogonal projection layers

Main: 10 pages
17 figures
Bibliography: 4 pages
5 tables
Appendix: 20 pages
Abstract

We introduce an output layer for neural networks that ensures satisfaction of convex constraints. Our approach, Πnet, leverages operator splitting for rapid and reliable projections in the forward pass, and the implicit function theorem for backpropagation. We deploy Πnet as a feasible-by-design optimization proxy for parametric constrained optimization problems and obtain modest-accuracy solutions faster than traditional solvers when solving a single problem, and significantly faster for a batch of problems. We surpass state-of-the-art learning approaches in terms of training time, solution quality, and robustness to hyperparameter tuning, while maintaining similar inference times. Finally, we tackle multi-vehicle motion planning with non-convex trajectory preferences and provide Πnet as a GPU-ready package implemented in JAX with effective tuning heuristics.
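To illustrate the kind of operator-splitting projection the abstract refers to, here is a minimal NumPy sketch of Dykstra's alternating-projection method, which computes the Euclidean projection onto an intersection of convex sets (here, a box intersected with a hyperplane). This is a hedged illustration of the general technique, not the authors' Πnet implementation; the sets, function names, and iteration count are assumptions for the example.

```python
import numpy as np

def proj_box(x, lo=0.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^n
    return np.clip(x, lo, hi)

def proj_hyperplane(x, s=1.0):
    # Euclidean projection onto the hyperplane {x : sum(x) = s}
    return x - (x.sum() - s) / x.size

def dykstra(x0, iters=200):
    # Dykstra's algorithm: an operator-splitting scheme that converges
    # to the projection of x0 onto the intersection of the two sets.
    x = x0.astype(float)
    p = np.zeros_like(x)  # correction term for the box projection
    q = np.zeros_like(x)  # correction term for the hyperplane projection
    for _ in range(iters):
        y = proj_box(x + p)
        p = x + p - y
        x = proj_hyperplane(y + q)
        q = y + q - x
    return x

# Project a point onto {x in [0,1]^3 : sum(x) = 1}
z = dykstra(np.array([2.0, -1.0, 0.5]))  # z ≈ [1., 0., 0.]
```

In a differentiable layer, the backward pass through such an iterative solver would be obtained via the implicit function theorem (differentiating the fixed-point conditions) rather than by unrolling the loop, which is the approach the abstract describes.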
