Shannon entropy and Kullback–Leibler divergence in multivariate log fundamental skew‐normal and related distributions
DOI: 10.1002/cjs.11285 · zbMath: 1357.62033 · arXiv: 1408.4755 · OpenAlex: W2963521009 · MaRDI QID: Q5507356
Marina M. de Queiroz, Roger W. C. Silva, Rosangela H. Loschi
Publication date: 19 December 2016
Published in: Canadian Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1408.4755
Mathematics Subject Classification:
- Applications of statistics to environmental and related topics (62P12)
- Bayesian inference (62F15)
- Characterization and structure theory for multivariate probability distributions; copulas (62H05)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
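As context for the title (textbook definitions, not expressions reproduced from the paper; see Cover and Thomas, Elements of Information Theory, in the cited works), the differential Shannon entropy of a density f on R^p and the Kullback–Leibler divergence of f relative to g are:

```latex
H(f) = -\int_{\mathbb{R}^p} f(\mathbf{x}) \log f(\mathbf{x})\, d\mathbf{x},
\qquad
D_{\mathrm{KL}}(f \,\|\, g) = \int_{\mathbb{R}^p} f(\mathbf{x}) \log \frac{f(\mathbf{x})}{g(\mathbf{x})}\, d\mathbf{x}.
```

When closed forms are unavailable, both quantities can be estimated by Monte Carlo, as in several of the cited estimation papers. The sketch below is illustrative only and is not the paper's method for the log fundamental skew-normal family: it uses Gaussian densities from scipy so the estimates can be checked against their known closed forms.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Minimal Monte Carlo sketch (illustrative, not the paper's method):
# estimate H(f) = -E_f[log f(X)] and D_KL(f||g) = E_f[log f(X) - log g(X)]
# by sampling from f. Gaussian f and g are chosen so that closed-form
# values are available as a sanity check.
mean_f, cov_f = np.array([0.0, 0.0]), np.array([[1.0, 0.3], [0.3, 1.0]])
mean_g, cov_g = np.array([0.5, -0.5]), np.array([[1.5, 0.0], [0.0, 0.8]])

f = multivariate_normal(mean_f, cov_f)
g = multivariate_normal(mean_g, cov_g)

rng = np.random.default_rng(0)
x = f.rvs(size=200_000, random_state=rng)  # draws from f

log_f, log_g = f.logpdf(x), g.logpdf(x)
H_mc = -log_f.mean()             # Monte Carlo entropy estimate
KL_mc = (log_f - log_g).mean()   # Monte Carlo KL estimate

# Closed-form Gaussian checks:
# H = 0.5 * log((2*pi*e)^p * |Sigma_f|)
p = len(mean_f)
H_exact = 0.5 * np.log((2 * np.pi * np.e) ** p * np.linalg.det(cov_f))
# KL = 0.5 * (tr(Sg^-1 Sf) + (mg-mf)' Sg^-1 (mg-mf) - p + log(|Sg|/|Sf|))
diff = mean_g - mean_f
inv_cov_g = np.linalg.inv(cov_g)
KL_exact = 0.5 * (np.trace(inv_cov_g @ cov_f) + diff @ inv_cov_g @ diff
                  - p + np.log(np.linalg.det(cov_g) / np.linalg.det(cov_f)))

print(f"entropy:  MC={H_mc:.4f}  exact={H_exact:.4f}")
print(f"KL(f||g): MC={KL_mc:.4f}  exact={KL_exact:.4f}")
```

For the log fundamental skew-normal family studied in the paper itself, the authors derive explicit entropy and divergence expressions rather than relying on simulation; the sketch above only illustrates the quantities involved.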
Cites Work
- A Mathematical Theory of Communication
- Kullback-Leibler divergence measure for multivariate skew-normal distributions
- Parametric Bayesian estimation of differential entropy and relative entropy
- Parameter interpretation in skewed logistic regression with random intercept
- On fundamental skew distributions
- Information Theory and Statistical Mechanics
- Entropy expressions and their estimators for multivariate distributions
- The multivariate skew-normal distribution
- Conjugate Priors Represent Strong Pre-Experimental Assumptions
- Shannon Entropy and Mutual Information for Multivariate Skew‐Elliptical Distributions
- A minimally informative likelihood for decision analysis: Illustration and robustness
- Elements of Information Theory
- Prior Probabilities
- Multivariate log-skewed distributions with normal kernel and their applications
- On Information and Sufficiency