On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics
From MaRDI portal
Publication:3650872
Abstract: We give a new proof of the theorems on the maximum entropy principle in Tsallis statistics. That is, we show that the \(q\)-canonical distribution attains the maximum value of the Tsallis entropy subject to the constraint on the \(q\)-expectation value, and that the \(q\)-Gaussian distribution attains the maximum value of the Tsallis entropy subject to the constraint on the \(q\)-variance, as applications of the nonnegativity of the Tsallis relative entropy, without using the method of Lagrange multipliers. In addition, we define a \(q\)-Fisher information and then prove a \(q\)-Cramér-Rao inequality showing that the \(q\)-Gaussian distribution with special \(q\)-variances attains the minimum value of the \(q\)-Fisher information.
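The abstract's proof strategy rests on the nonnegativity of the Tsallis relative entropy \(D_q(p\|r) = \big(\sum_i p_i^q r_i^{1-q} - 1\big)/(q-1)\), which reduces to the Kullback-Leibler divergence as \(q \to 1\). A minimal numerical sketch (using the standard textbook definitions of \(S_q\) and \(D_q\), not code from the paper itself) illustrates both the nonnegativity and how, with a uniform reference distribution, it yields the maximum-entropy statement:

```python
import random

def tsallis_entropy(p, q):
    # S_q(p) = (1 - sum_i p_i^q) / (q - 1); recovers Shannon entropy as q -> 1
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def tsallis_relative_entropy(p, r, q):
    # D_q(p||r) = (sum_i p_i^q r_i^{1-q} - 1) / (q - 1); nonnegative for q > 0, q != 1
    return (sum(pi ** q * ri ** (1.0 - q) for pi, ri in zip(p, r)) - 1.0) / (q - 1.0)

def random_dist(n):
    # A random probability vector of length n
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

random.seed(0)
q, n = 1.5, 5

# Nonnegativity of the Tsallis relative entropy on random distribution pairs
for _ in range(100):
    p, r = random_dist(n), random_dist(n)
    assert tsallis_relative_entropy(p, r, q) >= 0.0

# Taking r uniform, D_q(p||u) >= 0 rearranges to S_q(p) <= S_q(u):
# under the normalization constraint alone, the uniform distribution
# maximizes S_q -- the same mechanism the paper applies to the
# q-expectation and q-variance constraints.
u = [1.0 / n] * n
for _ in range(100):
    assert tsallis_entropy(random_dist(n), q) <= tsallis_entropy(u, q) + 1e-12
```

Note the constrained cases in the paper (the \(q\)-canonical and \(q\)-Gaussian distributions) follow the same pattern, with the reference distribution \(r\) chosen as the conjectured maximizer rather than the uniform one.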
Recommendations
- The unique non self-referential \(q\)-canonical distribution and the physical temperature derived from the maximum entropy principle in Tsallis statistics
- On a \((\beta, q)\)-generalized Fisher information and inequalities involving \(q\)-Gaussian distributions
- THE MAXIMUM ENTROPY PRINCIPLE FOR GENERALIZED ENTROPIES
- Some properties of generalized Fisher information in the context of nonextensive thermostatistics
- Tsallis distribution as a standard maximum entropy solution with `tail' constraint
Cites work
- scientific article; zbMATH DE number 2131215
- scientific article; zbMATH DE number 1754708
- Fundamental properties of Tsallis relative entropy
- Heat and entropy in nonextensive thermodynamics: transmutation from Tsallis theory to Rényi-entropy-based theory
- Information theoretical properties of Tsallis entropies
- Law of Error in Tsallis Statistics
- Nonextensive thermodynamic relations
- On a \(q\)-central limit theorem consistent with nonextensive statistical mechanics
- Possible generalization of Boltzmann-Gibbs statistics.
- Tsallis' entropy maximization procedure revisited
Cited in (20)
- Law of Error in Tsallis Statistics
- Inequalities for Tsallis relative entropy and generalized skew information
- Tsallis' entropies -- axiomatics, associated \(f\)-divergences and Fisher's information
- Geometry of \(q\)-exponential family of probability distributions
- Projective power entropy and maximum Tsallis entropy distributions
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Generalized wavelet Fisher's information of \(1 / f^\alpha\) signals
- The unique non self-referential \(q\)-canonical distribution and the physical temperature derived from the maximum entropy principle in Tsallis statistics
- Inequalities related to some types of entropies and divergences
- The confidence interval of q-Gaussian distributions
- On a \((\beta, q)\)-generalized Fisher information and inequalities involving \(q\)-Gaussian distributions
- Geometry of distributions associated with Tsallis statistics and properties of relative entropy minimization
- Fisher information and its extensions based on infinite mixture density functions
- Entropy, Fisher Information and Variance with Frost-Musulin Potential
- An axiomatic characterization of a two-parameter extended relative entropy
- Escort evolutionary game theory
- Maximum Entropy Reconstruction Using Derivative Information, Part 1: Fisher Information and Convex Duality
- Entropy -- a tale of ice and fire. (Review of some exceptional Tsallis indexes)
- Some properties of generalized Fisher information in the context of nonextensive thermostatistics
- Deriving partition functions and entropic functionals from thermodynamics