Conformal mirror descent with logarithmic divergences
From MaRDI portal
Publication:6138802
Abstract: The logarithmic divergence is an extension of the Bregman divergence motivated by optimal transport and a generalized convex duality, and it satisfies many remarkable properties. Using the geometry induced by the logarithmic divergence, we introduce a generalization of continuous-time mirror descent that we term conformal mirror descent. We derive its dynamics under a generalized mirror map and show that it is a time change of a corresponding Hessian gradient flow. We also prove convergence results in continuous time. We apply conformal mirror descent to the online estimation of a generalized exponential family, and we construct a family of gradient flows on the unit simplex via the Dirichlet optimal transport problem.
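As background for the abstract, the scheme the paper generalizes is classical (Bregman) mirror descent. A minimal discretized sketch with the negative-entropy mirror map on the probability simplex is given below; this is the standard multiplicative-weights update, not the paper's conformal variant with logarithmic divergences, and all function names and parameters are illustrative.

```python
import numpy as np

def entropic_mirror_descent(grad, x0, eta=0.1, steps=200):
    """Euler-discretized mirror descent on the probability simplex with the
    negative-entropy mirror map (multiplicative-weights update).
    Classical Bregman mirror descent, shown only as background; the paper's
    conformal mirror descent replaces the Bregman divergence with a
    logarithmic divergence."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))  # gradient step in the dual (log) coordinates
        x = x / x.sum()                 # map back to the simplex
    return x

# Minimize the linear objective f(x) = <c, x> over the simplex;
# the minimizer puts all mass on the smallest entry of c.
c = np.array([3.0, 1.0, 2.0])
x_star = entropic_mirror_descent(lambda x: c, np.ones(3) / 3)
```

With a linear objective the iterates converge to the vertex of the simplex minimizing the cost, so `x_star` concentrates on index 1.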
Cites work
- Scientific article, zbMATH DE number 3790208 (no title available)
- Scientific article, zbMATH DE number 1560711 (no title available)
- Scientific article, zbMATH DE number 1746020 (no title available)
- Scientific article, zbMATH DE number 1909499 (no title available)
- Scientific article, zbMATH DE number 3296905 (no title available)
- A Relationship Between Arbitrary Positive Matrices and Doubly Stochastic Matrices
- A gradient descent perspective on Sinkhorn
- A regression model for compositional data based on the shifted-Dirichlet distribution
- A variational perspective on accelerated methods in optimization
- Clustering with Bregman divergences
- Continuity, curvature, and the general covariance of optimal transportation
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Exponentially concave functions and a new information geometry
- Generalised Thermostatistics
- Geometry of minimum contrast
- Gradient systems in view of information geometry
- Hessian Riemannian Gradient Flows in Convex Programming
- Information Geometry of U-Boost and Bregman Divergence
- Information geometry
- Information geometry
- Information geometry and its applications
- Information geometry in portfolio theory
- Isometric logratio transformations for compositional data analysis
- Logarithmic divergences from optimal transport and Rényi geometry
- Logarithmic divergences: geometry and interpretation of curvature
- Logistic regression, AdaBoost and Bregman distances
- Minimum Divergence Methods in Statistical Machine Learning
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Multiplicative Schrödinger problem and the Dirichlet transport
- On Conformal Divergences and Their Population Minimizers
- Polar factorization and monotone rearrangement of vector‐valued functions
- Projection theorems and estimating equations for power-law models
- Pseudo-Riemannian geometry encodes information geometry in optimal transport
- Rényi Divergence and Kullback-Leibler Divergence
- Stability of a 4th-order curvature condition arising in optimal transport theory
- The Information Geometry of Mirror Descent
- The geometry of Hessian structures
- The geometry of relative arbitrage
- Tsallis and Rényi Deformations Linked via a New λ-Duality
Cited in: 1 document