ResearchTrend.AI

When Deep Learning Meets Polyhedral Theory: A Survey

29 April 2023
Joey Huchette
Gonzalo Muñoz
Thiago Serra
Calvin Tsay
Abstract

In the past decade, deep learning became the prevalent methodology for predictive modeling thanks to the remarkable accuracy of deep neural networks in tasks such as computer vision and natural language processing. Meanwhile, the structure of neural networks converged back to simpler representations based on piecewise constant and piecewise linear functions such as the Rectified Linear Unit (ReLU), which became the most commonly used type of activation function in neural networks. That made certain types of network structure, such as the typical fully-connected feedforward neural network, amenable to analysis through polyhedral theory and to the application of methodologies such as Linear Programming (LP) and Mixed-Integer Linear Programming (MILP) for a variety of purposes. In this paper, we survey the main topics emerging from this fast-paced area of work, which bring a fresh perspective to understanding neural networks in more detail as well as to applying linear optimization techniques to train, verify, and reduce the size of such networks.
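The polyhedral view rests on a simple fact the abstract alludes to: a feedforward network with ReLU activations computes a piecewise linear function, i.e., on any input region where the set of active neurons is fixed, the network reduces to a single affine map. A minimal sketch in plain Python (the weights and points below are illustrative assumptions, not taken from the paper):

```python
# A tiny one-hidden-layer ReLU network: f(x) = w2 . relu(W1 x + b1) + b2.
# On any region with a fixed activation pattern (fixed set of neurons with
# positive pre-activation), f is affine -- the basis of LP/MILP analyses,
# where each ReLU is typically encoded with a binary on/off variable
# (e.g., a big-M formulation) in the MILP.

def relu(v):
    return [max(0.0, x) for x in v]

def forward(x, W1, b1, w2, b2):
    pre = [sum(w * xi for w, xi in zip(row, x)) + b for row, b in zip(W1, b1)]
    h = relu(pre)
    return sum(w * hi for w, hi in zip(w2, h)) + b2

# Illustrative weights: f(x) = relu(x1) + relu(x2).
W1, b1 = [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0]
w2, b2 = [1.0, 1.0], 0.0

# Both points lie in the positive quadrant, so the activation pattern is
# fixed there and f behaves affinely along the segment between them.
a, c = [1.0, 2.0], [2.0, 4.0]
mid = [(ai + ci) / 2 for ai, ci in zip(a, c)]
fa = forward(a, W1, b1, w2, b2)   # 3.0
fc = forward(c, W1, b1, w2, b2)   # 6.0
fm = forward(mid, W1, b1, w2, b2) # 4.5, exactly the average of fa and fc
```

Crossing into a region where a neuron switches off changes the affine piece, which is why globally the network is only *piecewise* linear and why exact analyses need integer variables.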

@article{huchette2025_2305.00241,
  title={When Deep Learning Meets Polyhedral Theory: A Survey},
  author={Joey Huchette and Gonzalo Muñoz and Thiago Serra and Calvin Tsay},
  journal={arXiv preprint arXiv:2305.00241},
  year={2025}
}