
Mirror Descent Using the Tempesta Generalized Multi-parametric Logarithms

Main: 6 pages
Bibliography: 2 pages
Abstract

In this paper, we develop a wide class of Mirror Descent (MD) algorithms, which play a key role in machine learning. For this purpose, we formulate a constrained optimization problem in which we exploit the Bregman divergence with the Tempesta multi-parametric deformed logarithm as a link function. This link function, also called a mirror function, defines the mapping between the primal and dual spaces and is associated with a very wide (in fact, theoretically infinite) class of generalized trace-form entropies. In order to derive novel MD updates, we estimate a generalized exponential function that closely approximates the inverse of the multi-parametric Tempesta generalized logarithm. The shape and properties of the Tempesta logarithm and its inverse deformed exponential function can be tuned by several hyperparameters. By learning these hyperparameters, we can adapt to the distribution or geometry of the training data, and we can adjust them to achieve desired properties of the MD algorithms. The concept of applying multi-parametric logarithms allows us to generate a new, wide, and flexible family of MD and mirror-less MD updates.
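As a rough illustration of the update structure described above, the sketch below performs a mirror descent step with a deformed logarithm as the mirror map and its deformed exponential as the (approximate) inverse. The paper's Tempesta multi-parametric logarithm is not reproduced here; a one-parameter Tsallis-style q-logarithm stands in for it, and the names q_log, q_exp, mirror_descent_step and the hyperparameters eta and q are illustrative assumptions, not the paper's algorithm.

import numpy as np

def q_log(x, q):
    # Tsallis q-logarithm (stand-in for the Tempesta multi-parametric logarithm);
    # reduces to log(x) as q -> 1.
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    # q-exponential, the inverse of q_log; reduces to exp(x) as q -> 1.
    if np.isclose(q, 1.0):
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

def mirror_descent_step(w, grad, eta=0.1, q=1.5):
    # One MD step: map the primal iterate to the dual space via the deformed
    # logarithm, take a gradient step there, and map back via the deformed
    # exponential (the approximate inverse of the mirror map).
    theta = q_log(w, q)              # primal -> dual (mirror map)
    theta_new = theta - eta * grad   # gradient step in the dual space
    return q_exp(theta_new, q)       # dual -> primal

# Toy usage: a few steps on a simple quadratic objective over positive weights.
w = np.array([0.5, 1.5, 1.0])
target = np.array([1.0, 1.0, 1.0])
for _ in range(50):
    grad = 2.0 * (w - target)        # gradient of ||w - target||^2
    w = mirror_descent_step(w, grad)
print(w)                             # approaches the target for this step size

In this sketch, tuning the deformation parameter q changes the geometry induced by the mirror map, which is a one-parameter analogue of learning the several hyperparameters of the Tempesta logarithm mentioned in the abstract.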

@article{cichocki2025_2506.13984,
  title={Mirror Descent Using the Tempesta Generalized Multi-parametric Logarithms},
  author={Andrzej Cichocki},
  journal={arXiv preprint arXiv:2506.13984},
  year={2025}
}