A new class of metric divergences on probability spaces and its applicability in statistics
Publication: 1881424
DOI: 10.1007/BF02517812
zbMATH: 1052.62002
OpenAlex: W2063703901
Wikidata: Q30051945 (Scholia: Q30051945)
MaRDI QID: Q1881424
Authors: Igor Vajda, Ferdinand Österreicher
Publication date: 5 October 2004
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/bf02517812
Mathematics Subject Classification:
- Order statistics; empirical distribution functions (62G30)
- Statistical aspects of information-theoretic topics (62B10)
Related Items
- Optimal quantization of the support of a continuous multivariate distribution based on mutual information
- A new quantum \(f\)-divergence for trace class operators in Hilbert spaces
- Jensen-Shannon divergence and non-linear quantum dynamics
- Monoparametric family of metrics derived from classical Jensen-Shannon divergence
- Improved classification for compositional data using the \(\alpha\)-transformation
- A thermodynamical derivation of the quantum potential and the temperature of the wave function
- Bayesian estimation of differential transcript usage from RNA-seq data
- Implications of the Cressie-Read family of additive divergences for information recovery
- Analyzing complex networks evolution through Information Theory quantifiers
- A review of compositional data analysis and recent advances
- Divergence for \(s\)-concave and log concave functions
- From \(f\)-divergence to quantum quasi-entropies and their use
- Entropic image segmentation of sessile drops over patterned acetate
- Chain Rule Optimal Transport
- Mixed \(f\)-divergence and inequalities for log-concave functions
- Performance of the One-Sample Goodness-of-Fit P–P-Plot Length Test
- A Dirichlet regression model for compositional data with zeros
- Universal distribution of batch completion times and time-cost tradeoff in a production line with arbitrary buffer size
- Lectures on Entropy. I: Information-Theoretic Notions
- Orlicz addition for measures and an optimization problem for the \(f\)-divergence
- Improving the accuracy of goodness-of-fit tests based on Rao's divergence with small sample size
- On some improvements of the Jensen inequality with some applications
- Selection rules based on divergences
- Length of Time's Arrow
- Evaluation of the Copycat Model for Predicting Complex Network Growth
- Inequalities for quantum \(f\)-divergence of convex functions and matrices
- Quantum metrics based upon classical Jensen-Shannon divergence
Cites Work
- No empirical probability measure can converge in the total variation sense for all distributions
- Goodness-of-fit statistics for discrete multivariate data
- I-divergence geometry of probability distributions and minimization problems
- An efficient and robust adaptive estimator of location
- Minimum Hellinger distance estimates for parametric models
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Minimum Hellinger distance point estimates consistent under weak family regularity
- Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
- Formulae for mean integrated squared error of nonlinear wavelet-based density estimators
- Asymptotic divergence of estimates of discrete distributions
- Minimum Hellinger distance estimation in simple linear regression models; distribution and efficiency
- Minimum Kolmogorov distance estimates of parameters and parametrized distributions
- Distance and decision rules
- Divergence measures based on the Shannon entropy
- Length tests for goodness of fit
- Minimum Hellinger Distance Estimation for Multivariate Location and Covariance
- Distribution estimation consistent in total variation and in two types of information divergence
- Statistical information and discrimination
- Minimum Hellinger Distance Estimation for Finite Mixture Models
- About the asymptotic accuracy of Barron density estimates
- Asymptotic Normality of \(L_1\)-Error in Density Estimation
- Information-theoretical considerations on estimation problems
- Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation