Fisher information and its extensions based on infinite mixture density functions
DOI: 10.1016/j.physa.2023.128959
OpenAlex: W4380091763
MaRDI QID: Q6175303
Javier E. Contreras-Reyes, Omid Kharazmi, Hassan Jamali
Publication date: 21 July 2023
Published in: Physica A
Full work available at URL: https://doi.org/10.1016/j.physa.2023.128959
Keywords: Fisher information, Jensen-Fisher information, infinite mixture density function, Pearson-Vajda \(\chi^k\) divergence, skew-normal density
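The keywords concern Fisher information of mixture densities. As a rough illustration only, not taken from the article, the sketch below numerically approximates the nonparametric (shift) Fisher information \(I(f)=\int f'(x)^2/f(x)\,dx\) of a finite two-component normal mixture; the function names, grid limits, and mixture parameters are assumptions chosen for the example.

```python
# Minimal numerical sketch (illustrative only, not the paper's method):
# approximate I(f) = ∫ f'(x)^2 / f(x) dx for a finite normal mixture.
import numpy as np
from scipy.stats import norm

def mixture_pdf(x, weights, means, sds):
    """Density of a finite normal mixture sum_k w_k * N(mu_k, sigma_k^2)."""
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, sds))

def fisher_information(pdf, lo=-15.0, hi=15.0, n=100001):
    """Approximate I(f) via central differences and the trapezoid rule."""
    x = np.linspace(lo, hi, n)
    f = pdf(x)
    df = np.gradient(f, x)                       # numerical f'(x)
    integrand = np.where(f > 1e-300, df**2 / f, 0.0)
    dx = x[1] - x[0]
    return float(np.sum(0.5 * (integrand[:-1] + integrand[1:]) * dx))

# Example mixture: equal weights, N(-1, 1^2) and N(2, 0.5^2) (illustrative values)
w, mu, sd = [0.5, 0.5], [-1.0, 2.0], [1.0, 0.5]
print(fisher_information(lambda x: mixture_pdf(x, w, mu, sd)))

# Sanity check: for a single N(0, sigma^2), I(f) = 1/sigma^2, so this prints ≈ 1.0
print(fisher_information(lambda x: norm.pdf(x)))
```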
Cites Work
- A Mathematical Theory of Communication
- On the time-dependent Fisher information of a density function
- Some properties of generalized Fisher information in the context of nonextensive thermostatistics
- Chaotic systems with asymmetric heavy-tailed noise: application to 3D attractors
- A step beyond Tsallis and Rényi entropies
- On the Fisher information in record data
- Information quantity evaluation of nonlinear time series processes and applications
- Divergence measures based on the Shannon entropy
- On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics
- On scale mixtures of normal distributions
- Cumulative Residual and Relative Cumulative Residual Fisher Information and Their Properties
- Least kth-Order and Rényi Generative Adversarial Networks
- Moments of the Scores
- Mixture Models, Bayes Fisher Information, and Divergence Measures
- Extensions of Fisher Information and Stam's Inequality
- Jensen divergence based on Fisher’s information
- Elements of Information Theory
- On Information and Sufficiency
- Generating function for generalized Fisher information measure and its application to finite mixture models
- A general class of multivariate skew-elliptical distributions
- Source coding theorem based on a nonadditive information content