On a (β, q)-generalized Fisher information and inequalities involving q-Gaussian distributions
Publication: 2865448
DOI: 10.1063/1.4726197 · zbMath: 1278.82025 · arXiv: 1203.1435 · OpenAlex: W3101287047 · MaRDI QID: Q2865448
Publication date: 29 November 2013
Published in: Journal of Mathematical Physics
Full work available at URL: https://arxiv.org/abs/1203.1435
Keywords: Cramér-Rao inequality, Fisher information, Stam inequality, Tsallis entropy, Rényi entropy, \(q\)-Gaussian distributions
Related Items
- On statistical properties of Jizba-Arimitsu hybrid entropy
- Use of the geometric mean as a statistic for the scale of the coupled Gaussian distributions
- Logarithmic Sobolev and Shannon's inequalities and an application to the uncertainty principle
- Some properties of generalized Fisher information in the context of nonextensive thermostatistics
- Rescaling the nonadditivity parameter in Tsallis thermostatistics
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
Cites Work
- Sharp Gagliardo-Nirenberg inequalities via \(p\)-Laplacian type equations
- Existence and uniqueness of solutions for the Lane-Emden-Fowler type problem
- Generalised exponential families and associated entropy functions
- \(q\)-Gaussians in the porous-medium equation: stability and time evolution
- The general optimal \(L^{p}\)-Euclidean logarithmic Sobolev inequality by Hamilton--Jacobi equations.
- Best constants for Gagliardo-Nirenberg inequalities and applications to nonlinear diffusions.
- The calculus of variations
- The optimal Euclidean \(L^{p}\)-Sobolev logarithmic inequality.
- Possible generalization of Boltzmann-Gibbs statistics.
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Information geometry of \(q\)-Gaussian densities and behaviors of solutions to related diffusion equations
- Introduction to Nonextensive Statistical Mechanics
- Cramér-Rao and Moment-Entropy Inequalities for Rényi Entropy and Generalized Fisher Information
- Moment-Entropy Inequalities for a Random Vector
- On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics
- A Nonlinear Singular Boundary Value Problem in the Theory of Pseudoplastic Fluids
- Information theoretic inequalities
- Science from Fisher Information
- Logarithmic Sobolev Inequalities for Information Measures
- Extensions of Fisher Information and Stam's Inequality
- Robust Statistics
- Fisher information in a Tsallis non-extensive environment