Mixture Models, Bayes Fisher Information, and Divergence Measures
From MaRDI portal
Publication:5223934
DOI: 10.1109/TIT.2018.2877608
zbMATH Open: 1431.94040
OpenAlex: W2898129670
Wikidata: Q129049103 (Scholia: Q129049103)
MaRDI QID: Q5223934
FDO: Q5223934
Authors: Nader Ebrahimi, Omid Kharazmi, Ehsan S. Soofi, M. Asadi
Publication date: 19 July 2019
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2018.2877608
MSC classifications: Statistical aspects of information-theoretic topics (62B10); Measures of information, entropy (94A17)
Cited In (10)
- Jensen information generating function and its connections to some well-known information measures
- Fisher information and its extensions based on infinite mixture density functions
- Extropy: Characterizations and dynamic versions
- The alpha-mixture of survival functions
- Ordering results between two multiple-outlier finite \(\delta\)-mixtures
- Generating function for generalized Fisher information measure and its application to finite mixture models
- Stochastic comparison results between two finite mixture models with generalized Weibull distributed components
- Stochastic comparisons for finite mixtures from location-scale family of distributions
- Log-mean distribution: applications to medical data, survival regression, Bayesian and non-Bayesian discussion with MCMC algorithm
- On stochastic comparisons of finite \(\alpha \)-mixture models