Log-determinant divergences revisited: alpha-beta and gamma log-det divergences


DOI: 10.3390/E17052988
zbMATH Open: 1338.94034
DBLP: journals/entropy/CichockiCA15
arXiv: 1412.7146
OpenAlex: W2963959237
Wikidata/Scholia: Q60486507
MaRDI QID: Q296329


Authors: Andrzej Cichocki, Sergio Cruces, Shun-Ichi Amari


Publication date: 15 June 2016

Published in: Entropy

Abstract: In this paper, we review and extend a family of log-det divergences for symmetric positive definite (SPD) matrices and discuss their fundamental properties. We show how the parameterized Alpha-Beta (AB) and Gamma log-det divergences generate many well-known divergences, for example, Stein's loss, the S-divergence (also called the Jensen-Bregman LogDet (JBLD) divergence), the LogDet Zero (Bhattacharyya) divergence, and the Affine Invariant Riemannian Metric (AIRM), as well as some new divergences. Moreover, we establish links and correspondences among many log-det divergences and display them on the alpha-beta plane for various sets of parameters. Furthermore, this paper bridges these divergences and also shows their links to divergences of multivariate and multiway Gaussian distributions. Closed-form formulas are derived for Gamma divergences of two multivariate Gaussian densities, including as special cases the Kullback-Leibler, Bhattacharyya, Rényi, and Cauchy-Schwarz divergences. Symmetrized versions of the log-det divergences are also discussed and reviewed. A class of divergences is extended to multiway divergences for separable covariance (precision) matrices.
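Two of the special cases named in the abstract have simple closed forms, which the following minimal NumPy sketch illustrates. It is not code from the paper, only an illustration of the standard definitions: Stein's loss D(P, Q) = tr(PQ⁻¹) − log det(PQ⁻¹) − n, and the S-divergence (JBLD) D(P, Q) = log det((P+Q)/2) − ½ log det(PQ); both vanish when P = Q.

```python
import numpy as np

def steins_loss(P, Q):
    """Stein's loss between SPD matrices P and Q:
    tr(P Q^{-1}) - log det(P Q^{-1}) - n."""
    n = P.shape[0]
    M = np.linalg.solve(Q, P)            # Q^{-1} P; same trace and det as P Q^{-1}
    sign, logdet = np.linalg.slogdet(M)  # slogdet avoids overflow in det
    return np.trace(M) - logdet - n

def s_divergence(P, Q):
    """S-divergence (Jensen-Bregman LogDet, JBLD):
    log det((P+Q)/2) - 0.5*(log det P + log det Q)."""
    ld = lambda A: np.linalg.slogdet(A)[1]
    return ld((P + Q) / 2.0) - 0.5 * (ld(P) + ld(Q))

# Random SPD test matrices (A A^T plus a ridge to ensure positive definiteness)
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); P = A @ A.T + 4 * np.eye(4)
B = rng.standard_normal((4, 4)); Q = B @ B.T + 4 * np.eye(4)

print(steins_loss(P, P))      # ~0: the divergence vanishes at P == Q
print(s_divergence(P, Q))     # nonnegative for SPD arguments
```

Both quantities are nonnegative for SPD arguments (for Stein's loss this follows eigenvalue-wise from λ − log λ − 1 ≥ 0), and the S-divergence is symmetric in P and Q, unlike Stein's loss.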


Full work available at URL: https://arxiv.org/abs/1412.7146








Cited In (12)





