Entropy expressions for multivariate continuous distributions
Publication: 4503552
DOI: 10.1109/18.825848
zbMATH: 0996.94018
OpenAlex: W2122378596
MaRDI QID: Q4503552
Georges A. Darbellay, Igor Vajda
Publication date: 7 September 2000
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.825848
Keywords: mutual information, differential entropy, Gamma-exponential distribution, ordered Weinman exponential distribution
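For orientation on the keyword terms, the paper deals with the standard notions of differential entropy and mutual information for continuous random vectors; stated in LaTeX, the usual textbook definitions (given here only for context, not reproduced from the paper) are

\[
h(X) = -\int_{\mathbb{R}^d} f_X(x)\,\log f_X(x)\,dx,
\qquad
I(X;Y) = \int f_{X,Y}(x,y)\,\log\frac{f_{X,Y}(x,y)}{f_X(x)\,f_Y(y)}\,dx\,dy,
\]

where \(f_X\), \(f_Y\), and \(f_{X,Y}\) denote the marginal and joint densities.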
Related Items (24)
Predictability of operational processes over finite horizon
Multivariate dynamic information
A maximum entropy characterization of symmetric Kotz type and Burr multivariate distributions
Distances between models of generalized order statistics
An application of copulas to OPEC's changing influence on fossil fuel prices
Bivariate residual entropy function: A quantile approach
Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
Smoothed kernel estimation of bivariate residual entropy function
Comparison, utility, and partition of dependence under absolutely continuous and singular distributions
Formulas for Rényi information and related measures for univariate distributions
Multivariate maximum entropy identification, transformation, and dependence
Application of data compression methods to nonparametric estimation of characteristics of discrete-time stochastic processes
Bivariate generalized cumulative residual entropy
Dimensionless Measures of Variability and Dependence for Multivariate Continuous Distributions
Expressions for Rényi and Shannon entropies for bivariate distributions
Shannon information in record data
Expressions for Rényi and Shannon entropies for multivariate distributions
Measuring stochastic dependence using \(\phi\)-divergence
Universal codes as a basis for time series testing
Maximum entropy characterizations of the multivariate Liouville distributions
Universal codes as a basis for nonparametric testing of serial independence for time series
Estimation of mutual information by the fuzzy histogram
Unnamed Item
Information measures of Dirichlet distribution with applications