Fast Minimization of Expected Logarithmic Loss via Stochastic Dual Averaging

5 November 2023
C. Tsai
Hao-Chung Cheng
Yen-Huan Li
arXiv:2311.02557
Abstract

Consider the problem of minimizing an expected logarithmic loss over either the probability simplex or the set of quantum density matrices. This problem includes tasks such as solving the Poisson inverse problem, computing the maximum-likelihood estimate for quantum state tomography, and approximating positive semi-definite matrix permanents with the currently tightest approximation ratio. Although the optimization problem is convex, standard iteration complexity guarantees for first-order methods do not directly apply due to the absence of Lipschitz continuity and smoothness in the loss function. In this work, we propose a stochastic first-order algorithm named $B$-sample stochastic dual averaging with the logarithmic barrier. For the Poisson inverse problem, our algorithm attains an $\varepsilon$-optimal solution in $\tilde{O}(d^2/\varepsilon^2)$ time, matching the state of the art, where $d$ denotes the dimension. When computing the maximum-likelihood estimate for quantum state tomography, our algorithm yields an $\varepsilon$-optimal solution in $\tilde{O}(d^3/\varepsilon^2)$ time. This improves on the time complexities of existing stochastic first-order methods by a factor of $d^{\omega-2}$ and on those of batch methods by a factor of $d^2$, where $\omega$ denotes the matrix multiplication exponent. Numerical experiments demonstrate that our algorithm empirically outperforms existing methods with explicit complexity guarantees.
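The abstract names the algorithm but this page carries no pseudocode. Below is a minimal Python sketch of the generic stochastic-dual-averaging template with a logarithmic-barrier regularizer, specialized to the probability simplex (the Poisson-inverse-problem setting). It is an illustration under stated assumptions, not the paper's method: the single-sample gradient ($B = 1$), the step-size schedule $\beta_t = \sqrt{t}$, the bisection solver, and the names `logbarrier_step` and `sda_log_loss` are all hypothetical; the paper's $B$-sample update and tuning may differ.

```python
import numpy as np

def logbarrier_step(z, beta, iters=100):
    """Solve min_x <z, x> - beta * sum(log x) over the probability simplex.

    Stationarity gives x_i = beta / (z_i + lam); the multiplier lam is
    found by bisection so that the coordinates sum to one.
    """
    eps = 1e-12 * max(1.0, abs(z.min()))
    lo = -z.min() + eps               # lam must exceed -min(z) so that x > 0
    hi = lo + beta * z.size + 1.0     # at this lam, sum_i beta/(z_i + lam) < 1
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if (beta / (z + mid)).sum() > 1.0:
            lo = mid                  # sum too large -> raise the multiplier
        else:
            hi = mid
    x = beta / (z + 0.5 * (lo + hi))
    return x / x.sum()                # renormalize away residual bisection error

def sda_log_loss(A, T, rng):
    """Stochastic dual averaging for min_x (1/n) sum_t -log <a_t, x>."""
    n, d = A.shape
    g_sum = np.zeros(d)               # running sum of stochastic gradients
    x = np.full(d, 1.0 / d)           # start at the barycenter of the simplex
    x_bar = np.zeros(d)
    for t in range(1, T + 1):
        a = A[rng.integers(n)]        # draw one sample (B = 1 in this sketch)
        g = -a / max(a @ x, 1e-300)   # stochastic gradient of -log <a, x>
        g_sum += g
        beta_t = np.sqrt(t)           # assumed schedule; the paper's may differ
        x = logbarrier_step(g_sum, beta_t)
        x_bar += (x - x_bar) / t      # running average of the iterates
    return x_bar

# Toy Poisson-style instance with nonnegative data.
rng = np.random.default_rng(0)
A = rng.random((500, 10))
x_hat = sda_log_loss(A, T=2000, rng=rng)
print(x_hat.sum(), x_hat.min())       # ~1.0, strictly positive
```

The point the sketch illustrates: with the logarithmic barrier, each dual-averaging step reduces to a one-dimensional root-finding problem for the multiplier, so the per-iteration cost stays linear in $d$ apart from the bisection, even though the log loss is neither Lipschitz nor smooth near the simplex boundary.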
