\((f,\Gamma)\)-divergences: interpolating between \(f\)-divergences and integral probability metrics
From MaRDI portal
Publication: 5054641
Authors: Jeremiah Birrell, Paul Dupuis, Markos Katsoulakis, Yannis Pantazis, Luc Rey-Bellet
Publication date: 29 November 2022
Full work available at URL: https://arxiv.org/abs/2011.05953
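For orientation, the \((f,\Gamma)\)-divergence studied in this work admits a variational formula of roughly the following form (a sketch based on the cited arXiv preprint; the notation, in particular the normalization term \(\nu\), may differ slightly from the published version):

```latex
% Variational form of the (f,Gamma)-divergence between probability
% measures P and Q, with f* the Legendre transform of f and Gamma a
% chosen class of test functions:
\[
  D_f^{\Gamma}(P \,\|\, Q)
    = \sup_{g \in \Gamma}
      \Bigl\{ \mathbb{E}_P[g] - \Lambda_f^{Q}[g] \Bigr\},
  \qquad
  \Lambda_f^{Q}[g]
    = \inf_{\nu \in \mathbb{R}}
      \bigl\{ \nu + \mathbb{E}_Q\bigl[f^{*}(g - \nu)\bigr] \bigr\}.
\]
```

Informally, enlarging \(\Gamma\) to all bounded measurable functions recovers the classical \(f\)-divergence \(D_f(P\|Q)\), while for suitable \(f\) the objective reduces to \(\sup_{g\in\Gamma}\{\mathbb{E}_P[g]-\mathbb{E}_Q[g]\}\), the integral probability metric generated by \(\Gamma\); this is the interpolation the title refers to.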
Recommendations
- Optimal bounds between \(f\)-divergences and integral probability metrics
- Formulation and properties of a divergence used to compare probability measures without absolute continuity
- Transport information Bregman divergences
- On the empirical estimation of integral probability metrics
- A new class of metric divergences on probability spaces and its applicability in statistics
Cites Work
- On Information and Sufficiency
- On the empirical estimation of integral probability metrics
- Optimal transport for applied mathematicians. Calculus of variations, PDEs, and modeling
- Support Vector Machines
- Integral Probability Metrics and Their Generating Classes of Functions
- Universality, Characteristic Kernels and RKHS Embedding of Measures
- A kernel two-sample test
- Optimal Transport
- Convex Analysis
- On Divergences and Informations in Statistics and Information Theory
- Formulation and properties of a divergence used to compare probability measures without absolute continuity
- Multilayer feedforward networks are universal approximators
- Approximation by superpositions of a sigmoidal function
- Another Proof that Convex Functions are Locally Lipschitz
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- An Old-New Concept of Convex Risk Measures: The Optimized Certainty Equivalent
- Calculation of the Wasserstein Distance Between Probability Distributions on the Line
- Duality in Vector Optimization
- Minimization of φ-divergences on sets of signed measures
- Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions
- Uniform Central Limit Theorems
- Robust risk measurement and model risk
- Measuring distribution model risk
- Path-space information bounds for uncertainty quantification and sensitivity analysis of stochastic dynamics
- Robust sensitivity analysis for stochastic systems
- Robust bounds on risk-sensitive functionals via Rényi divergence
- Distinguishing and integrating aleatoric and epistemic variation in uncertainty quantification
- How Biased Is Your Model? Concentration Inequalities, Information and Model Bias
- Convergence rate of \(\mathcal{O}(1/k)\) for optimistic gradient and extragradient methods in smooth convex-concave saddle point problems
- Sensitivity analysis for rare events based on Rényi divergence
- Variational representations and neural network estimation of Rényi divergences
- Optimizing Variational Representations of Divergences and Accelerating Their Statistical Estimation
Cited In (5)
- Conditional sampling with monotone GANs: from generative models to likelihood-free inference
- Aggregated tests based on supremal divergence estimators for non-regular statistical models
- Model Uncertainty and Correctability for Directed Graphical Models
- Lipschitz-regularized gradient flows and generative particle algorithms for high-dimensional scarce data
- Wasserstein-based fairness interpretability framework for machine learning models
This page was built for publication: \((f,\Gamma)\)-divergences: interpolating between \(f\)-divergences and integral probability metrics