On Divergences and Informations in Statistics and Information Theory
Publication: 3547915
DOI: 10.1109/TIT.2006.881731 · zbMath: 1287.94025 · Wikidata: Q57424218 · Scholia: Q57424218 · MaRDI QID: Q3547915
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Classification (MSC): Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items
Estimation with infinite-dimensional exponential family and Fisher divergence
Divergences on monads for relational program logics
The right complexity measure in locally private estimation: it is not the Fisher information
Stackelberg differential game for insurance under model ambiguity: general divergence
Aggregated tests based on supremal divergence estimators for non-regular statistical models
On local divergences between two probability measures
Optimal quantization of the support of a continuous multivariate distribution based on mutual information
Thermodynamic Bayesian model comparison
Solving high-dimensional Hamilton-Jacobi-Bellman PDEs using neural networks: perspectives from the theory of controlled diffusions and measures on path space
A brief review of linear sufficient dimension reduction through optimization
Path-Based Divergence Rates and Lagrangian Uncertainty in Stochastic Flows
Applications of entropy in finance: a review
Floating bodies and approximation of convex bodies by polytopes
Minimal sufficient positive-operator valued measure on a separable Hilbert space
A unified approach to sufficient dimension reduction
Complete entropic inequalities for quantum Markov chains
Gradient and passive circuit structure in a class of non-linear dynamics on a graph
Perspective functions: properties, constructions, and examples
Existence, consistency and computer simulation for selected variants of minimum distance estimators
Overlap in observational studies with high-dimensional covariates
On the properties that characterize privacy
A refinement and an exact equality condition for the basic inequality of f-divergences
Dual divergence estimators and tests: robustness results
Entropy based risk measures
Ensemble Markov Chain Monte Carlo with Teleporting Walkers
Discussion on article "Bayesian inference with misspecified models"
Minimum divergence estimators, maximum likelihood and exponential families
Curvature functionals on convex bodies
Elaboration Models with Symmetric Information Divergence
Unrestricted information acquisition
On the maximum values of \(f\)-divergence and Rényi divergence under a given variational distance
SLISEMAP: supervised dimensionality reduction through local explanations
An analytically solvable principal-agent model
Optimal shrinkage estimation of predictive densities under \(\alpha\)-divergences
Refinements of discrete and integral Jensen inequalities with Jensen's gap
Dual divergences estimation for censored survival data
Robust Statistical Engineering by Means of Scaled Bregman Distances
Fredholm integral relation between compound estimation and prediction (FIRCEP)
Optimal entropy-transport problems and a new Hellinger-Kantorovich distance between positive measures
Rényi divergence and \(L_p\)-affine surface area for convex bodies
Optimal insurance under maxmin expected utility
Dual divergence estimators of the tail index
The Hellinger Correlation
Decomposable pseudodistances and applications in statistical estimation
Technical Note: The Joint Impact of F-Divergences and Reference Models on the Contents of Uncertainty Sets
Combining marginal probability distributions via minimization of weighted sum of Kullback-Leibler divergences
Divergence for \(s\)-concave and log concave functions
A new class of metrics for learning on real-valued and structured data
A combined MAP and Bayesian scheme for finite data and/or moving horizon estimation
From \(f\)-divergence to quantum quasi-entropies and their use
On Bayesian estimation via divergences
Causality and Bayesian network PDEs for multiscale representations of porous media
Tests of goodness of fit based on Phi-divergence
Variational regularisation for inverse problems with imperfect forward operators and general noise models
Shape constrained density estimation via penalized Rényi divergence
On testing local hypotheses via local divergence
Sensitivity analysis with \(\chi^2\)-divergences
Analyzing anonymity attacks through noisy channels
On divergences of finite measures and their applicability in statistics and information theory
Quantifying ambiguity bounds via time-consistent sets of indistinguishable models
Lagrangian Uncertainty Quantification and Information Inequalities for Stochastic Flows
Divergence-Based Vector Quantization
Multiclass classification, information, divergence and surrogate risk
Relative and Discrete Utility Maximising Entropy
Variational Representations and Neural Network Estimation of Rényi Divergences
Contraction and regularizing properties of heat flows in metric measure spaces
The minimum increment of f-divergences given total variation distances
Parametric estimation and tests through divergences and the duality technique
Statistical inference on random dot product graphs: a survey
Mixed f-divergence and inequalities for log-concave functions
A Steiner formula in the \(L_p\) Brunn-Minkowski theory
Towards a better understanding of the dual representation of phi divergences
Law invariant risk measures and information divergences
Least-Squares Independent Component Analysis
Several Applications of Divergence Criteria in Continuous Families
Limit theorems for eigenvectors of the normalized Laplacian for random graphs
Uncertainty Quantification for Markov Processes via Variational Principles and Functional Inequalities
Local modulated wave model for the reconstruction of space-time energy spectra in turbulent flows
Limit Theorems for φ-Divergences Based on k-Spacings
A novel nonparametric distance estimator for densities with error bounds
Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
Imitation Learning as f-Divergence Minimization
Orlicz Addition for Measures and an Optimization Problem for the \(f\)-divergence
QUANTUM f-DIVERGENCES AND ERROR CORRECTION
General bootstrap for dual \(\phi\)-divergence estimates
Refinements of the integral Jensen's inequality generated by finite or infinite permutations
A simplified and unified generalization of some majorization results
Selection rules based on divergences
FROM QUASI-ENTROPY TO SKEW INFORMATION
Comparison of contraction coefficients for \(f\)-divergences
Coherence quantifiers from the viewpoint of their decreases in the measurement process
Quantification of model uncertainty on path-space via goal-oriented relative entropy
On divergence tests for composite hypotheses under composite likelihood
Sup-sums principles for \(F\)-divergence and a new definition for \(t\)-entropy
New estimates and tests of independence in some copula models
Image segmentation using level set driven by generalized divergence
Asymptotics of smoothed Wasserstein distances