A new class of metric divergences on probability spaces and its applicability in statistics
From MaRDI portal
DOI: 10.1007/BF02517812
zbMath: 1052.62002
Wikidata: Q30051945 (Scholia: Q30051945)
MaRDI QID: Q1881424
Igor Vajda, Ferdinand Österreicher
Publication date: 5 October 2004
Published in: Annals of the Institute of Statistical Mathematics
62G30: Order statistics; empirical distribution functions
62B10: Statistical aspects of information-theoretic topics
Related Items
- Lectures on Entropy. I: Information-Theoretic Notions
- Chain Rule Optimal Transport
- Mixed \(f\)-divergence and inequalities for log-concave functions
- Orlicz Addition for Measures and an Optimization Problem for the \(f\)-divergence
- Evaluation of the Copycat Model for Predicting Complex Network Growth
- Inequalities for quantum \(f\)-divergence of convex functions and matrices
- Performance of the One-Sample Goodness-of-Fit P–P-Plot Length Test
- Optimal quantization of the support of a continuous multivariate distribution based on mutual information
- A new quantum \(f\)-divergence for trace class operators in Hilbert spaces
- Improved classification for compositional data using the \(\alpha\)-transformation
- Implications of the Cressie-Read family of additive divergences for information recovery
- Analyzing complex networks evolution through Information Theory quantifiers
- From \(f\)-divergence to quantum quasi-entropies and their use
- A Dirichlet regression model for compositional data with zeros
- On some improvements of the Jensen inequality with some applications
- Improving the accuracy of goodness-of-fit tests based on Rao's divergence with small sample size
- Bayesian estimation of differential transcript usage from RNA-seq data
- Universal distribution of batch completion times and time-cost tradeoff in a production line with arbitrary buffer size
- Quantum metrics based upon classical Jensen-Shannon divergence
- Entropic image segmentation of sessile drops over patterned acetate
- Jensen-Shannon divergence and non-linear quantum dynamics
- Divergence for \(s\)-concave and log concave functions
- Selection rules based on divergences
- Length of Time's Arrow
Cites Work
- No empirical probability measure can converge in the total variation sense for all distributions
- Goodness-of-fit statistics for discrete multivariate data
- I-divergence geometry of probability distributions and minimization problems
- An efficient and robust adaptive estimator of location
- Minimum Hellinger distance estimates for parametric models
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Minimum Hellinger distance point estimates consistent under weak family regularity
- Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
- Formulae for mean integrated squared error of nonlinear wavelet-based density estimators
- Asymptotic divergence of estimates of discrete distributions
- Minimum Hellinger distance estimation in simple linear regression models; distribution and efficiency
- Minimum Kolmogorov distance estimates of parameters and parametrized distributions
- Distance and decision rules
- Divergence measures based on the Shannon entropy
- Length tests for goodness of fit
- Minimum Hellinger Distance Estimation for Multivariate Location and Covariance
- Distribution estimation consistent in total variation and in two types of information divergence
- Statistical information and discrimination
- Minimum Hellinger Distance Estimation for Finite Mixture Models
- About the asymptotic accuracy of Barron density estimates
- Asymptotic Normality of \(L_1\)-Error in Density Estimation
- Information-theoretical considerations on estimation problems
- Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation