Measuring stochastic dependence using \(\phi\)-divergence
Publication:2489782
DOI: 10.1016/j.jmva.2005.04.007
zbMath: 1085.62077
MaRDI QID: Q2489782
Konstantinos G. Zografos, Athanasios C. Micheas
Publication date: 28 April 2006
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2005.04.007
62H99: Multivariate analysis
62H15: Hypothesis testing in multivariate analysis
62H20: Measures of association (correlation, canonical correlation, etc.)
65C05: Monte Carlo methods
62B10: Statistical aspects of information-theoretic topics
Related Items
- Evaluation of mutual information estimators for time series
- On some entropy and divergence type measures of variability and dependence for mixed continuous and discrete variables
- Mining and visualising ordinal data with non-parametric continuous BBNs
- A measure of mutual complete dependence
Cites Work
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- On nonparametric measures of dependence for random variables
- An asymptotic test of independence for multivariate \(t\) and Cauchy random variables with applications
- Measures of multivariate dependence based on a distance between Fisher information matrices
- Distance and decision rules
- On the f-divergence and singularity of probability measures
- An informational measure of correlation
- On measures of dependence
- Information gain and a general measure of correlation
- Relative Entropy Measures of Multivariate Dependence
- A general correlation coefficient for directional data and related regression problems
- On a measure of dependence based on Fisher's information matrix
- Measures of Dependence and Tests of Independence
- Entropy expressions for multivariate continuous distributions
- Measures of Association between vectors
- Mutual Information and Maximal Correlation as Measures of Dependence
- On Information and Sufficiency
- Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation