Deformed statistics Kullback-Leibler divergence minimization within a scaled Bregman framework
From MaRDI portal
Publication: 1928038
DOI: 10.1016/j.physleta.2011.09.021
zbMath: 1254.82003
arXiv: 1102.1025
OpenAlex: W2092766144
MaRDI QID: Q1928038
R. C. Venkatesan, Angel Plastino
Publication date: 2 January 2013
Published in: Physics Letters A
Full work available at URL: https://arxiv.org/abs/1102.1025
Pythagorean theorem; generalized Tsallis statistics; additive duality; dual generalized Kullback-Leibler divergence; scaled Bregman divergences
- Classical equilibrium statistical mechanics (general) (82B05)
- Measures of information, entropy (94A17)
- Statistical thermodynamics (82B30)
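As a minimal illustrative sketch (not taken from the paper itself), the generalized Tsallis Kullback-Leibler divergence named in the keywords can be written with the q-deformed logarithm ln_q(x) = (x^(1-q) - 1)/(1 - q), which recovers the ordinary logarithm as q → 1. The sign convention D_q(p‖r) = -Σ_i p_i ln_q(r_i/p_i) used below is one common choice and is an assumption here:

```python
import math

def ln_q(x, q):
    """q-deformed (Tsallis) logarithm; reduces to ln(x) as q -> 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def kl_q(p, r, q):
    """Generalized (Tsallis) KL divergence D_q(p || r) = -sum_i p_i ln_q(r_i / p_i).

    Recovers the standard Kullback-Leibler divergence at q = 1 and
    vanishes when p == r for any q.
    """
    return -sum(pi * ln_q(ri / pi, q) for pi, ri in zip(p, r) if pi > 0.0)

# Example: at q = 1 this coincides with the ordinary KL divergence.
p = [0.5, 0.5]
r = [0.9, 0.1]
print(kl_q(p, r, 1.0))   # standard KL divergence of p from r
print(kl_q(p, r, 0.5))   # a q-deformed variant of the same comparison
```

The deformed divergence stays non-negative for q > 0 and reduces term-by-term to p_i ln(p_i/r_i) in the q → 1 limit, which is the sense in which the deformed-statistics minimization generalizes the classical one.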
Cites Work
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- I-divergence geometry of probability distributions and minimization problems
- Tsallis' entropy maximization procedure revisited
- Comments on and correction to "Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy" (Jan 80 26-37) [Corresp.]
- Introduction to Nonextensive Statistical Mechanics
- Properties of cross-entropy minimization
- Information geometry and Plefka's mean-field theory
- Information gain within nonextensive thermostatistics
- DOI: 10.1162/153244303322753689
- Equivalence of the four versions of Tsallis’s statistics
- ON THE PROBLEM OF CONSTRAINTS IN NONEXTENSIVE FORMALISM: A QUANTUM MECHANICAL TREATMENT
- A Note on Minimum Discrimination Information
- Information Theory and Statistics: A Tutorial
- Relative loss bounds for on-line density estimation with the exponential family of distributions