Non-local Optimization: Imposing Structure on Optimization Problems by Relaxation

11 November 2020
Nils Müller
Tobias Glasmachers
Abstract

In stochastic optimization, particularly in evolutionary computation and reinforcement learning, the optimization of a function $f: \Omega \to \mathbb{R}$ is often addressed through optimizing a so-called relaxation $\theta \in \Theta \mapsto \mathbb{E}_\theta(f)$ of $f$, where $\Theta$ represents the parameters of a family of probability measures on $\Omega$. We investigate the structure of such relaxations by means of measure theory and Fourier analysis, enabling us to shed light on the success of many associated stochastic optimization methods. The main structural traits we derive, which allow fast and reliable optimization of relaxations, are the consistency of optimal values of $f$, Lipschitzness of gradients, and convexity. We emphasize settings where $f$ itself is not differentiable or convex, e.g., in the presence of (stochastic) disturbance.
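
The relaxation idea can be made concrete with a minimal sketch (not from the paper; the Gaussian family, objective, and step sizes below are illustrative assumptions). Taking $\Theta$ to be the means $\mu$ of an isotropic Gaussian family on $\Omega = \mathbb{R}^d$ with fixed scale $\sigma$, the relaxation $\mathbb{E}_\theta(f)$ is smooth even when $f$ is not, and its gradient admits the score-function Monte Carlo estimator $\nabla_\mu \mathbb{E}_\theta(f) = \mathbb{E}_\theta[f(x)(x - \mu)/\sigma^2]$, as used by evolution strategies:

import numpy as np

def f(x):
    # Non-differentiable, noisy objective: the l1-norm plus a stochastic disturbance.
    return np.abs(x).sum() + 0.1 * np.random.randn()

def relaxed_gradient_step(mu, sigma=0.3, n=200, lr=0.05):
    # Monte Carlo estimate of grad_mu E_theta(f) via the score-function identity;
    # f is never differentiated, only evaluated at samples from N(mu, sigma^2 I).
    x = mu + sigma * np.random.randn(n, mu.size)
    fx = np.array([f(xi) for xi in x])
    fx -= fx.mean()  # baseline subtraction reduces variance without adding bias
    grad = (fx[:, None] * (x - mu)).mean(axis=0) / sigma**2
    return mu - lr * grad

mu = np.array([2.0, -3.0])
for _ in range(500):
    mu = relaxed_gradient_step(mu)
print(mu)  # drifts toward 0, the minimizer, despite f being non-differentiable

The Gaussian relaxation of this $f$ is smooth with Lipschitz gradients and remains convex, matching the structural traits the abstract highlights; hyperparameter values here are arbitrary choices for the sketch.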

View on arXiv: 2011.06064