Fast Minimization of Expected Logarithmic Loss via Stochastic Dual Averaging

5 November 2023 (arXiv:2311.02557)
C. Tsai, Hao-Chung Cheng, Yen-Huan Li
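To illustrate the general idea named in the title, below is a minimal sketch of stochastic dual averaging applied to expected logarithmic loss over the probability simplex (the portfolio-selection setting). It is an assumption-laden illustration, not the paper's algorithm: the function name `sda_log_loss`, the entropic regularizer, the fixed step size `eta`, and the toy data are all illustrative choices and do not reproduce the paper's regularizer, step-size schedule, or guarantees.

```python
import numpy as np

def sda_log_loss(reward_vectors, eta=0.1):
    """Sketch of stochastic dual averaging for expected log-loss minimization
    over the probability simplex (e.g., online portfolio selection).

    Illustrative only: uses a negative-entropy regularizer and a fixed step
    size; the cited paper's method and analysis differ.
    """
    d = reward_vectors.shape[1]
    grad_sum = np.zeros(d)            # running sum of stochastic gradients
    x = np.full(d, 1.0 / d)           # start from the uniform distribution
    losses = []
    for a in reward_vectors:
        losses.append(-np.log(a @ x))  # log loss of the current iterate
        grad_sum += -a / (a @ x)       # gradient of -log<a, x> at x
        # Dual-averaging step: x_{t+1} = argmin_x <grad_sum, x> + (1/eta) H(x),
        # with H the negative entropy; closed form x_{t+1} ∝ exp(-eta * grad_sum).
        z = -eta * grad_sum
        z -= z.max()                   # stabilize the exponential
        x = np.exp(z)
        x /= x.sum()
    return x, losses

# Toy usage with random positive reward vectors.
rng = np.random.default_rng(0)
A = rng.uniform(0.5, 1.5, size=(1000, 5))
x_hat, losses = sda_log_loss(A)
print(x_hat, sum(losses) / len(losses))
```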

Papers citing "Fast Minimization of Expected Logarithmic Loss via Stochastic Dual Averaging"

Online Self-Concordant and Relatively Smooth Minimization, With Applications to Online Portfolio Selection and Learning Quantum States
C. Tsai, Hao-Chung Cheng, Yen-Huan Li
03 Oct 2022

Pushing the Efficiency-Regret Pareto Frontier for Online Learning of Portfolios and Quantum States
Julian Zimmert, Naman Agarwal, Satyen Kale
06 Feb 2022

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010