Rényi Divergence and Kullback-Leibler Divergence
Publication:2986250
DOI: 10.1109/TIT.2014.2320500
zbMATH Open: 1360.94180
arXiv: 1206.2459
OpenAlex: W2026653933
Wikidata: Q59408941 (Scholia: Q59408941)
MaRDI QID: Q2986250
FDO: Q2986250
Authors: Tim van Erven, Peter Harremoës
Publication date: 16 May 2017
Published in: IEEE Transactions on Information Theory
Abstract: Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon entropy, and it comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and it depends on a parameter called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of σ-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also show how to generalize the Pythagorean inequality to orders different from 1, and we extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders) and present several other minimax results.
Full work available at URL: https://arxiv.org/abs/1206.2459
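For finite discrete distributions, the order-α Rényi divergence described in the abstract has the closed form D_α(P‖Q) = (α−1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α), and its α → 1 limit is the Kullback-Leibler divergence. A minimal sketch of this relation, with example distributions chosen here purely for illustration:

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1)
    between finite discrete distributions p and q (natural log)."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the order-1 limit of the above."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions (not from the paper).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# As alpha -> 1, the Rényi divergence approaches the KL divergence,
# and it is nondecreasing in the order alpha.
for alpha in (0.5, 0.9, 0.999):
    print(f"D_{alpha}(P||Q) = {renyi_divergence(p, q, alpha):.6f}")
print(f"KL(P||Q)    = {kl_divergence(p, q):.6f}")
```

This numerically illustrates two properties reviewed in the paper: convergence to Kullback-Leibler divergence as the order tends to 1, and monotonicity in the order.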
Cited In (first 100 items):
- Conditionally structured variational Gaussian approximation with importance weights
- Learning models with uniform performance via distributionally robust optimization
- The sphere packing bound for memoryless channels
- Information aware max-norm Dirichlet networks for predictive uncertainty estimation
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Exponential decay of Rényi divergence under Fokker-Planck equations
- Bayesian fractional posteriors
- Framework based on communicability to measure the similarity of nodes in complex networks
- Convergence rates of variational posterior distributions
- Uncertainty, information, and disagreement of economic forecasters
- Price probabilities: a class of Bayesian and non-Bayesian prediction rules
- Analysis of Langevin Monte Carlo via convex optimization
- Optimal insurance under maxmin expected utility
- Poisson approximation in \(\chi^2\) distance by the Stein-Chen approach
- On the maximum values of \(f\)-divergence and Rényi divergence under a given variational distance
- On some optimization problems for the Rényi divergence
- Improved security proofs in lattice-based cryptography: using the Rényi divergence rather than the statistical distance
- Convexity and robustness of the Rényi entropy
- The alpha-mixture of survival functions
- On variational expressions for quantum relative entropies
- Fast near collision attack on the Grain v1 stream cipher
- On the hardness of learning with rounding over small modulus
- Concentration of tempered posteriors and of their variational approximations
- A variational formula for risk-sensitive reward
- Holographic second laws of black hole thermodynamics
- Sandwiched Rényi divergence satisfies data processing inequality
- Operator-valued Schatten spaces and quantum entropies
- Converting information into probability measures with the Kullback-Leibler divergence
- Infomax strategies for an optimal balance between exploration and exploitation
- A geometric variational approach to Bayesian inference
- Holographic Rényi relative divergence in JT gravity
- Towards classical hardness of module-LWE: the linear rank case
- Distance-based tests for planar shape
- On the robustness of randomized classifiers to adversarial examples
- Entropy based risk measures
- Rényi relative entropies of quantum Gaussian states
- A criterion on apportionment methods minimizing the Rényi's divergence
- Swiveled Rényi entropies
- Wasserstein-divergence transportation inequalities and polynomial concentration inequalities
- The Kullback–Leibler Divergence Rate Between Markov Sources
- Consistency of variational Bayes inference for estimation and model selection in mixtures
- Concentrated differential privacy: simplifications, extensions, and lower bounds
- Uniqueness and characterization theorems for generalized entropies
- Rényi divergence and the central limit theorem
- Infinite-dimensional gradient-based descent for alpha-divergence minimisation
- Rényi's divergence as a chemical similarity criterion
- ordpy: a Python package for data analysis with permutation entropy and ordinal network methods
- Limits on the efficiency of (ring) LWE based non-interactive key exchange
- Fast rates for general unbounded loss functions: from ERM to generalized Bayes
- On dynamic pricing under model uncertainty
- Scalable information inequalities for uncertainty quantification
- Common Information, Noise Stability, and Their Extensions
- Optimal experimental design for prediction based on push-forward probability measures
- Robust mean variance optimization problem under Rényi divergence information
- Rényi divergence minimization based co-regularized multiview clustering
- Rényi divergences from Euclidean quenches
- Measuring diversity in heterogeneous information networks
- On stochastic comparisons of finite \(\alpha \)-mixture models
- Multiple-source adaptation theory and algorithms
- Characterization of time series via Rényi complexity-entropy curves
- Ordering on the probability simplex of endmembers for hyperspectral morphological image processing
- Logarithmic divergences from optimal transport and Rényi geometry
- Sandwiched Rényi relative entropy on density operators
- Transport-majorization to analytic and geometric inequalities
- Hierarchical micro-macro acceleration for moment models of kinetic equations
- Zipf-Mandelbrot law, \(f\)-divergences and the Jensen-type interpolating inequalities
- The information-theoretic treatment of spinless particles with the assorted diatomic molecular potential
- Label distribution learning by regularized sample self-representation
- Contagion in financial systems: a Bayesian network approach
- Sub-domain adaptation learning methodology
- \(\alpha\)-variational inference with statistical guarantees
- Studying the impact of fluctuations, spikes and rare events in time series through a wavelet entropy predictability measure
- Variational representations of annealing paths: Bregman information under monotonic embedding
- Rényi relative entropy from homogeneous Kullback-Leibler divergence Lagrangian
- FL-MAC-RDP: federated learning over multiple access channels with Rényi differential privacy
- An active contour model for texture image segmentation using Rényi divergence measure
- Thermodynamic utility of non-Markovianity from the perspective of resource interconversion
- Conformal mirror descent with logarithmic divergences
- Sensitivity analysis for rare events based on Rényi divergence
- Decay of convolved densities via Laplace transform
- Optimal information, Jensen-RIG function and \(\alpha\)-Onicescu's correlation coefficient in terms of information generating functions
- Local permutation tests for conditional independence
- Convergence rates of deep ReLU networks for multiclass classification
- Variational representations and neural network estimation of Rényi divergences
- The Augustin capacity and center
- General constructions of fuzzy extractors for continuous sources
- Two fractional order cumulative residual time series measures based on Rényi entropy
- The Kullback-Leibler divergence between lattice Gaussian distributions
- The right complexity measure in locally private estimation: it is not the Fisher information
- Optimal control of probabilistic Boolean control networks: A scalable infinite horizon approach
- Stressing dynamic loss models
- Mosaics of combinatorial designs for information-theoretic security
- A primer on alpha-information theory with application to leakage in secrecy systems
- Information fractal dimension of mass function
- Fault-tolerant fusion using \(\alpha\)-Rényi divergence for autonomous vehicle localization
- Stability of the spectral gap and ground state indistinguishability for a decorated AKLT model
- Random matrix improved covariance estimation for a large class of metrics*
- Consistency and convergence rate of phylogenetic inference via regularization
- On the privacy of noisy stochastic gradient descent for convex optimization