HoP: Homeomorphic Polar Learning for Hard Constrained Optimization

Constrained optimization demands highly efficient solvers, which has promoted the development of learn-to-optimize (L2O) approaches. As a data-driven method, L2O leverages neural networks to efficiently produce approximate solutions. However, a significant challenge remains in ensuring both the optimality and the feasibility of the network's output. To tackle this issue, we introduce Homeomorphic Polar Learning (HoP), which solves star-convex hard-constrained optimization problems by embedding a homeomorphic mapping in the neural network. The bijective structure enables end-to-end training without extra penalty terms or correction steps. We evaluate HoP across a variety of synthetic optimization tasks and real-world applications in wireless communications. In all cases, HoP achieves solutions closer to the optimum than existing L2O methods while strictly maintaining feasibility.
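
The underlying idea is that a polar parameterization can map an unconstrained network output onto a star-convex feasible set by construction. The sketch below, written in PyTorch, illustrates this under simplifying assumptions: the layer name PolarFeasibleLayer, the boundary_radius function, and the unit-ball constraint are hypothetical choices for illustration, not the authors' implementation.

```python
# Minimal sketch of a polar feasibility layer, assuming the feasible set is
# star-convex with respect to the origin. Names and the unit-ball example are
# illustrative assumptions, not taken from the HoP paper.
import torch
import torch.nn as nn


class PolarFeasibleLayer(nn.Module):
    """Map raw network outputs onto a star-convex feasible set via polar coordinates.

    The network predicts a direction and a normalized radius in [0, 1]. Scaling the
    radius by the feasible set's boundary radius along that direction keeps every
    output inside the set, so no penalty or projection step is needed.
    """

    def __init__(self, boundary_radius):
        super().__init__()
        # boundary_radius(direction) -> maximal feasible radius along that direction.
        # For a unit ball centered at the star center, this is simply 1 everywhere.
        self.boundary_radius = boundary_radius

    def forward(self, raw_direction, raw_radius):
        # Unit direction from the star center (assumed to be the origin here).
        direction = raw_direction / raw_direction.norm(dim=-1, keepdim=True).clamp_min(1e-8)
        # Squash the radius into [0, 1] so the point never leaves the set.
        rho = torch.sigmoid(raw_radius)
        return direction * rho * self.boundary_radius(direction)


if __name__ == "__main__":
    # Toy usage: constraint ||x|| <= 1 (a ball is trivially star-convex).
    layer = PolarFeasibleLayer(boundary_radius=lambda d: torch.ones(d.shape[:-1] + (1,)))
    net = nn.Linear(4, 3)            # predicts a raw direction (2 dims) and a raw radius (1 dim)
    out = net(torch.randn(8, 4))
    x = layer(out[:, :2], out[:, 2:3])
    assert (x.norm(dim=-1) <= 1.0 + 1e-6).all()  # feasibility holds by construction
```

Because this mapping from (direction, radius) to the output point is differentiable and invertible on the set's interior, gradients flow through it directly, which is what permits end-to-end training without penalty terms or post-hoc corrections.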
@article{deng2025_2502.00304,
  title   = {HoP: Homeomorphic Polar Learning for Hard Constrained Optimization},
  author  = {Ke Deng and Hanwen Zhang and Jin Lu and Haijian Sun},
  journal = {arXiv preprint arXiv:2502.00304},
  year    = {2025}
}