Law of Error in Tsallis Statistics
Publication:3547684
DOI: 10.1109/TIT.2004.840862
zbMATH Open: 1298.94041
arXiv: cond-mat/0401540
MaRDI QID: Q3547684
Authors: Hiroki Suyari, Makoto Tsukada
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Abstract: Gauss' law of error is generalized in Tsallis statistics, as applied to multifractal systems, in which Tsallis entropy plays the essential role instead of Shannon entropy. For the generalization, we apply the new multiplication operation determined by the q-logarithm and q-exponential functions to the definition of the likelihood function in Gauss' law of error. The maximum likelihood principle then leads to the Tsallis distribution as a nonextensive generalization of the Gaussian distribution.
Full work available at URL: https://arxiv.org/abs/cond-mat/0401540
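The multiplication operation mentioned in the abstract is the q-product, defined so that the q-logarithm turns it into a sum, which is what lets the likelihood of independent observations factorize in the deformed sense. A minimal numerical sketch of these functions and the factorization identity ln_q(x ⊗_q y) = ln_q(x) + ln_q(y) (function names and the sample value q = 1.3 are illustrative choices, not from the paper):

```python
import math

def ln_q(x, q):
    # q-logarithm: ln_q(x) = (x**(1-q) - 1)/(1-q); recovers ln(x) as q -> 1
    if q == 1.0:
        return math.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def exp_q(x, q):
    # q-exponential, the inverse of ln_q on its domain; 0 outside the cutoff
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base**(1.0 / (1.0 - q)) if base > 0 else 0.0

def q_product(x, y, q):
    # q-product, defined so ln_q additively factorizes over it
    if q == 1.0:
        return x * y
    base = x**(1.0 - q) + y**(1.0 - q) - 1.0
    return base**(1.0 / (1.0 - q)) if base > 0 else 0.0

q = 1.3
x, y = 2.0, 3.0
# exp_q inverts ln_q
assert abs(exp_q(ln_q(x, q), q) - x) < 1e-9
# the q-logarithm of a q-product is the sum of q-logarithms
assert abs(ln_q(q_product(x, y, q), q) - (ln_q(x, q) + ln_q(y, q))) < 1e-9
```

With these definitions, replacing the ordinary product of densities by the q-product in the likelihood and maximizing yields a q-Gaussian (Tsallis distribution) in place of the Gaussian, which is the paper's main result.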
Recommendations
- Tsallis distribution as a standard maximum entropy solution with `tail' constraint
- Foundations of nonextensive statistical mechanics
- Shannon, Lévy, and Tsallis: a note
- Remarks on the phenomenological Tsallis distributions and their link with the Tsallis statistics
- On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics
MSC classification: Statistical aspects of information-theoretic topics (62B10); Measures of information, entropy (94A17)
Cited In (17)
- Law of multiplicative error and its generalization to the correlated observations represented by the \(q\)-product
- The \(\kappa\)-generalizations of Stirling approximation and multinominal coefficients
- A risk profile for information fusion algorithms
- Remarks on the phenomenological Tsallis distributions and their link with the Tsallis statistics
- Nonextensive entropies derived from Gauss' principle
- Entropic forms and related algebras
- Deformed algebras and generalizations of independence on deformed exponential families
- Maximum-likelihood \(q\)-estimator uncovers the role of potassium at neuromuscular junctions
- Generalized probability functions
- Least informative distributions in maximum \(q\)-log-likelihood estimation
- Generalized log-likelihood functions and Bregman divergences
- Laws of large numbers for \(q\)-dependent random variables and nonextensive statistical mechanics
- Newton's cooling law in generalised statistical mechanics
- Robust full-waveform inversion using \(q\)-statistics
- Extensive and nonextensive statistics in seismic inversion
- Smoothed functional algorithms for stochastic optimization using \(q\)-Gaussian distributions
- On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics