Measuring stochastic dependence using \(\phi\)-divergence
DOI: 10.1016/j.jmva.2005.04.007 · zbMath: 1085.62077 · OpenAlex: W1995026624 · MaRDI QID: Q2489782
Athanasios C. Micheas, Konstantinos G. Zografos
Publication date: 28 April 2006
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2005.04.007
- Multivariate analysis (62H99)
- Hypothesis testing in multivariate analysis (62H15)
- Measures of association (correlation, canonical correlation, etc.) (62H20)
- Monte Carlo methods (65C05)
- Statistical aspects of information-theoretic topics (62B10)
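The classification above covers dependence measures built from \(\phi\)-divergences. A minimal sketch of the underlying idea: a \(\phi\)-divergence between the joint distribution and the product of its marginals is zero exactly when the variables are independent, so it serves as a dependence measure. The example below (illustrative only, not the paper's construction) uses the Kullback-Leibler generator \(\phi(t) = t \log t\), which yields mutual information, on a discrete bivariate table; the function names are assumptions for this sketch.

```python
import math

def phi_kl(t):
    # Convex generator phi(t) = t*log(t), with phi(1) = 0; this choice of
    # phi makes the phi-divergence the Kullback-Leibler divergence.
    return t * math.log(t) if t > 0 else 0.0

def phi_dependence(joint):
    # phi-divergence between a discrete joint distribution p(x, y)
    # (given as a nested list) and the product of its marginals p(x)*p(y).
    # For strictly convex phi this is zero iff X and Y are independent;
    # with phi = t*log(t) it equals the mutual information.
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    total = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            q = px[i] * py[j]                   # independence reference
            if q > 0:
                total += q * phi_kl(p / q)
    return total

# Independent 2x2 joint: the measure is 0.
indep = [[0.25, 0.25], [0.25, 0.25]]
# Perfectly dependent joint (mass on the diagonal): measure is log(2).
dep = [[0.5, 0.0], [0.0, 0.5]]

print(phi_dependence(indep))  # 0.0
print(phi_dependence(dep))    # log(2) ≈ 0.6931
```

Other convex generators \(\phi\) (e.g. those giving the Hellinger or \(\chi^2\) divergences) slot into the same template by replacing `phi_kl`.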
Related Items (13)
- Normalizing Random Vector Anisotropy Magnitude
- Evaluation of mutual information estimators for time series
- Elaboration Models with Symmetric Information Divergence
- A nonparametric two-sample test using a general φ-divergence-based mutual information
- Comparison, utility, and partition of dependence under absolutely continuous and singular distributions
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
- The Hellinger Correlation
- Discrimination among bivariate beta-generated distributions
- Entropy measure for the quantification of upper quantile interdependence in multivariate distributions
- On some entropy and divergence type measures of variability and dependence for mixed continuous and discrete variables
- Mining and visualising ordinal data with non-parametric continuous BBNs
- A measure of mutual complete dependence
- A note on the notion of informative composite density
Cites Work
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- On nonparametric measures of dependence for random variables
- An asymptotic test of independence for multivariate \(t\) and Cauchy random variables with applications
- Measures of multivariate dependence based on a distance between Fisher information matrices
- Distance and decision rules
- On the f-divergence and singularity of probability measures
- An informational measure of correlation
- On measures of dependence
- Information gain and a general measure of correlation
- Relative Entropy Measures of Multivariate Dependence
- A general correlation coefficient for directional data and related regression problems
- On a measure of dependence based on Fisher's information matrix
- Measures of Dependence and Tests of Independence
- Entropy expressions for multivariate continuous distributions
- Measures of Association between vectors
- On measures of statistical dependence
- Mutual Information and Maximal Correlation as Measures of Dependence
- On Information and Sufficiency
- Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation