Rényi Divergence and Kullback-Leibler Divergence
From MaRDI portal
Publication:2986250
Abstract: Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of σ-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also show how to generalize the Pythagorean inequality to orders different from 1, and we extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders) and present several other minimax results.
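The abstract's central claim, that Rényi divergence depends on an order parameter and recovers Kullback-Leibler divergence at order 1, can be checked numerically for discrete distributions. The following is a minimal sketch; the function names and example distributions are illustrative, not from the paper:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) = sum_i p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha:
    D_alpha(P||Q) = (1 / (alpha - 1)) * log(sum_i p_i^alpha * q_i^(1 - alpha)),
    defined for alpha > 0, alpha != 1; the alpha -> 1 limit is KL divergence."""
    if alpha == 1:
        return kl_divergence(p, q)
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# As alpha -> 1, the Rényi divergence approaches the KL divergence.
print(renyi_divergence(p, q, 0.999))
print(kl_divergence(p, q))
```

Running this shows the order-0.999 Rényi divergence agreeing with the KL divergence to several decimal places; it also exhibits the monotonicity in the order that the paper discusses (smaller orders give smaller divergence).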
Cited in (first 100 items shown):
- The information-theoretic treatment of spinless particles with the assorted diatomic molecular potential
- \(\alpha\)-variational inference with statistical guarantees
- Learning models with uniform performance via distributionally robust optimization
- Hierarchical micro-macro acceleration for moment models of kinetic equations
- Conditionally structured variational Gaussian approximation with importance weights
- Contagion in financial systems: a Bayesian network approach
- Rényi relative entropy from homogeneous Kullback-Leibler divergence Lagrangian
- FL-MAC-RDP: federated learning over multiple access channels with Rényi differential privacy
- Communication-efficient and privacy-preserving large-scale federated learning counteracting heterogeneity
- The sphere packing bound for memoryless channels
- Studying the impact of fluctuations, spikes and rare events in time series through a wavelet entropy predictability measure
- Variational representations of annealing paths: Bregman information under monotonic embedding
- An active contour model for texture image segmentation using Rényi divergence measure
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Bayesian fractional posteriors
- Exponential decay of Rényi divergence under Fokker-Planck equations
- Information aware max-norm Dirichlet networks for predictive uncertainty estimation
- Thermodynamic utility of non-Markovianity from the perspective of resource interconversion
- Framework based on communicability to measure the similarity of nodes in complex networks
- Conformal mirror descent with logarithmic divergences
- Convergence rates of variational posterior distributions
- Sensitivity analysis for rare events based on Rényi divergence
- Optimal information, Jensen-RIG function and \(\alpha\)-Onicescu's correlation coefficient in terms of information generating functions
- Local permutation tests for conditional independence
- Convergence rates of deep ReLU networks for multiclass classification
- Price probabilities: a class of Bayesian and non-Bayesian prediction rules
- Decay of convolved densities via Laplace transform
- The Augustin capacity and center
- Uncertainty, information, and disagreement of economic forecasters
- Variational representations and neural network estimation of Rényi divergences
- Analysis of Langevin Monte Carlo via convex optimization
- Mosaics of combinatorial designs for information-theoretic security
- A primer on alpha-information theory with application to leakage in secrecy systems
- General constructions of fuzzy extractors for continuous sources
- Two fractional order cumulative residual time series measures based on Rényi entropy
- The Kullback-Leibler divergence between lattice Gaussian distributions
- The right complexity measure in locally private estimation: it is not the Fisher information
- Optimal control of probabilistic Boolean control networks: A scalable infinite horizon approach
- Stressing dynamic loss models
- Information fractal dimension of mass function
- Optimal insurance under maxmin expected utility
- Poisson approximation in \(\chi^2\) distance by the Stein-Chen approach
- Fault-tolerant fusion using \(\alpha\)-Rényi divergence for autonomous vehicle localization
- Consistency and convergence rate of phylogenetic inference via regularization
- On the maximum values of \(f\)-divergence and Rényi divergence under a given variational distance
- On some optimization problems for the Rényi divergence
- Random matrix improved covariance estimation for a large class of metrics*
- Improved security proofs in lattice-based cryptography: using the Rényi divergence rather than the statistical distance
- Stability of the spectral gap and ground state indistinguishability for a decorated AKLT model
- Convexity and robustness of the Rényi entropy
- Conditions for the existence of a generalization of Rényi divergence
- On variational expressions for quantum relative entropies
- Fast near collision attack on the Grain v1 stream cipher
- On the privacy of noisy stochastic gradient descent for convex optimization
- scientific article (zbMATH DE number 7306861; no title available)
- The alpha-mixture of survival functions
- On the hardness of learning with rounding over small modulus
- Concentration of tempered posteriors and of their variational approximations
- A Study on Weighted Doubly Truncated Rényi Divergence
- Bit security as computational cost for winning games with high probability
- Quantum mechanics based on an extended least action principle and information metrics of vacuum fluctuations
- Rapid convergence of the unadjusted Langevin algorithm: isoperimetry suffices
- A variational formula for risk-sensitive reward
- Holographic second laws of black hole thermodynamics
- Converting information into probability measures with the Kullback-Leibler divergence
- G+G: a Fiat-Shamir lattice signature based on convolved Gaussians
- Simple threshold (fully homomorphic) encryption from LWE with polynomial modulus
- Sandwiched Rényi divergence satisfies data processing inequality
- On the equivalence of statistical distances for isotropic convex measures
- Operator-valued Schatten spaces and quantum entropies
- Infomax strategies for an optimal balance between exploration and exploitation
- Bayesian brains and the Rényi divergence
- Concentrated differentially private average consensus algorithm for a discrete-time network with heterogeneous dynamics
- ALESQP: An Augmented Lagrangian Equality-Constrained SQP Method for Optimization with General Constraints
- Training quantum neural networks using the quantum information bottleneck method
- A hyper-distance-based method for hypernetwork comparison
- Unified view for notions of bit security
- Taming numerical imprecision by adapting the KL divergence to negative probabilities
- Holographic Rényi relative divergence in JT gravity
- Towards classical hardness of module-LWE: the linear rank case
- A geometric variational approach to Bayesian inference
- Distance-based tests for planar shape
- scientific article (zbMATH DE number 7306871; no title available)
- Quantum scalar field theory based on an extended least action principle
- On the robustness of randomized classifiers to adversarial examples
- Entropy based risk measures
- A criterion on apportionment methods minimizing the Rényi's divergence
- Rényi relative entropies of quantum Gaussian states
- Swiveled Rényi entropies
- A simple derivation of the refined sphere packing bound under certain symmetry hypotheses
- Chain Rule Optimal Transport
- Asymptotics for Strassen's optimal transport problem
- Wasserstein-divergence transportation inequalities and polynomial concentration inequalities
- On variational inference and maximum likelihood estimation with the \(\lambda\)-exponential family
- A unified approach to the Pythagorean identity and projection theorem for a class of divergences based on M-estimations
- Chaining meets chain rule: multilevel entropic regularization and training of neural networks
- State convertibility in the von Neumann algebra framework
- Consistency of variational Bayes inference for estimation and model selection in mixtures
- The Kullback–Leibler Divergence Rate Between Markov Sources
- Concentrated differential privacy: simplifications, extensions, and lower bounds