Duality of maximum entropy and minimum divergence
From MaRDI portal
Publication: 296478
DOI: 10.3390/e16073552 · zbMath: 1338.62021 · OpenAlex: W1992146538 · MaRDI QID: Q296478
Shinto Eguchi, Osamu Komori, Atsumi Ohara
Publication date: 15 June 2016
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e16073552
Keywords: information geometry; sufficiency; multivariate \(t\)-distribution; \(\beta\)-divergence; dual connections; MaxEnt; power exponential family
Directional data; spatial statistics (62H11) ⋮ Statistical aspects of information-theoretic topics (62B10)
Related Items (3)
Dually flat geometries of the deformed exponential family ⋮ Least informative distributions in maximum \(q\)-log-likelihood estimation ⋮ Geometry of parametric binary choice models
Cites Work
- Group invariance of information geometry on \(q\)-Gaussian distributions induced by beta-divergence
- Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds
- \(F\)-geometry and Amari's \(\alpha\)-geometry on a statistical manifold
- Projective power entropy and maximum Tsallis entropy distributions
- Entropy and divergence associated with power function and the statistical application
- Families of alpha-, beta- and gamma-divergences: flexible and robust measures of similarities
- Generalised exponential families and associated entropy functions
- Robust parameter estimation with a small bias against heavy contamination
- Second order efficiency of minimum contrast estimators in a curved exponential family
- Geometry of minimum contrast
- Statistics, yokes and symplectic geometry
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- An infinite-dimensional geometric structure on the space of all the probability measures equivalent to a given one
- Possible generalization of Boltzmann-Gibbs statistics
- Robust estimation in the normal mixture model
- Robust Blind Source Separation by Beta Divergence
- Information geometry of q-Gaussian densities and behaviors of solutions to related diffusion equations
- Nonnegative Matrix Factorization with the Itakura-Saito Divergence: With Application to Music Analysis
- Robust and efficient estimation by minimising a density power divergence
- A class of logistic-type discriminant functions
- Robustifying AdaBoost by Adding the Naive Error Rate
- Information Geometry of U-Boost and Bregman Divergence
- Measurement of Diversity