On local divergences between two probability measures
From MaRDI portal
Recommendations
- A generalization of local divergence measures
- On metric divergences of probability measures
- Generalized local divergence measures
- Some general divergence measures for probability distributions
- On Bregman Distances and Divergences of Probability Measures
- On testing local hypotheses via local divergence
- Divergence measures and a general framework for local variational approximation
- Formulation and properties of a divergence used to compare probability measures without absolute continuity
- On \(f\)-divergence for \(\sigma \)-\(\oplus \)-measures
- On divergences of finite measures and their applicability in statistics and information theory
Cites work
- scientific article; zbMATH DE number 3143969
- scientific article; zbMATH DE number 3908323
- scientific article; zbMATH DE number 3911472
- scientific article; zbMATH DE number 4072103
- scientific article; zbMATH DE number 53182
- scientific article; zbMATH DE number 3252891
- scientific article; zbMATH DE number 3322635
- scientific article; zbMATH DE number 2221907
- scientific article; zbMATH DE number 3200971
- A local spectral approach for assessing time series model misspecification
- Divergence-based estimation and testing with misclassified data
- Entropy, divergence and distance measures with econometric applications
- Expressions for Rényi and Shannon entropies for multivariate distributions
- Goodness of fit tests with weights in the classes based on \((h,\phi )\)-divergences
- Information indices: Unification and applications
- Minimum \((h,\phi )\)-divergence estimators with weights
- On Divergences and Informations in Statistics and Information Theory
- On Information and Sufficiency
- On Mardia's and Song's measures of kurtosis in elliptical distributions
- On divergences of finite measures and their applicability in statistics and information theory
- On the f-divergence and singularity of probability measures
- Principal Information Theoretic Approaches
- Rényi Statistics in Directed Families of Exponential Experiments
- Statistical Inference
- Theory of statistical inference and information. Transl. from the Slovak by the author
Cited in (6)
- Some universal insights on divergences for statistics, machine learning and artificial intelligence
- On testing local hypotheses via local divergence
- Local divergence and association
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
- A generalization of local divergence measures
- A criterion for local model selection
This page was built for publication: On local divergences between two probability measures (MaRDI item Q263904)