Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
DOI: 10.1109/TIT.2010.2068870 · zbMATH Open: 1366.62071 · arXiv: 0809.0853 · MaRDI QID: Q5281236
Authors: Xuanlong Nguyen, Martin J. Wainwright, Michael Jordan
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://arxiv.org/abs/0809.0853
Classification: Nonparametric estimation (62G05); Nonparametric regression and quantile regression (62G08); Asymptotic properties of nonparametric inference (62G20); Convex programming (90C25)
Cited In (56)
- Variational representations of annealing paths: Bregman information under monotonic embedding
- Optimal experimental design: formulations and computations
- Learning under nonstationarity: covariate shift and class-balance change
- Non-parametric two-sample tests: recent developments and prospects
- Robust Validation: Confident Predictions Even When Distributions Shift
- Variational Bayesian optimal experimental design with normalizing flows
- Adaptive joint distribution learning
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Formulation and properties of a divergence used to compare probability measures without absolute continuity
- Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation
- Geometry of EM and related iterative algorithms
- On distributionally robust extreme value analysis
- Solving inverse stochastic problems from discrete particle observations using the Fokker-Planck equation and physics-informed neural networks
- On the empirical estimation of integral probability metrics
- Nonparametric estimation of Kullback-Leibler divergence
- Variational representations and neural network estimation of Rényi divergences
- A Deep Generative Approach to Conditional Sampling
- Non-parametric estimation of mutual information through the entropy of the linkage
- A Monte Carlo approach to quantifying model error in Bayesian parameter estimation
- Level sets semimetrics for probability measures with applications in hypothesis testing
- Change-point detection in time-series data by relative density-ratio estimation
- Robust Actuarial Risk Analysis
- Online direct density-ratio estimation applied to inlier-based outlier detection
- Relative Density-Ratio Estimation for Robust Distribution Comparison
- Statistical analysis of distance estimators with density differences and density ratios
- Minimum Divergence, Generalized Empirical Likelihoods, and Higher Order Expansions
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Convergence of latent mixing measures in finite and infinite mixture models
- Aggregated tests based on supremal divergence estimators for non-regular statistical models
- Smoothed noise contrastive mutual information neural estimation
- Computational complexity of kernel-based density-ratio estimation: a condition number analysis
- Machine learning with squared-loss mutual information
- Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence
- Conditional density estimation with dimensionality reduction via squared-loss conditional entropy minimization
- Model Uncertainty and Correctability for Directed Graphical Models
- Semi-supervised learning of class balance under class-prior change by distribution matching
- Imitation learning as \(f\)-divergence minimization
- Probabilistic model validation for uncertain nonlinear systems
- Density-difference estimation
- Data-driven spatiotemporal modeling for structural dynamics on irregular domains by stochastic dependency neural estimation
- Least-squares two-sample test
- Optimizing Variational Representations of Divergences and Accelerating Their Statistical Estimation
- Geometrical Insights for Implicit Generative Modeling
- Direct density derivative estimation
- Quantization and clustering with Bregman divergences
- Constructive setting for problems of density ratio estimation
- Improving bridge estimators via \(f\)-GAN
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Modern Bayesian experimental design
- Calibrated adversarial algorithms for generative modelling
- Sufficient dimension reduction via squared-loss mutual information estimation
- Stein variational gradient descent with learned direction
- Reducing the statistical error of generative adversarial networks using space-filling sampling
- GAT–GMM: Generative Adversarial Training for Gaussian Mixture Models