Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization

From MaRDI portal
Publication: 5281236

DOI: 10.1109/TIT.2010.2068870
zbMath: 1366.62071
arXiv: 0809.0853
MaRDI QID: Q5281236

Michael I. Jordan, XuanLong Nguyen, Martin J. Wainwright

Publication date: 27 July 2017

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://arxiv.org/abs/0809.0853




Related Items (44)

- GAT–GMM: Generative Adversarial Training for Gaussian Mixture Models
- Non-parametric estimation of mutual information through the entropy of the linkage
- A Monte Carlo approach to quantifying model error in Bayesian parameter estimation
- Statistical analysis of distance estimators with density differences and density ratios
- Model Uncertainty and Correctability for Directed Graphical Models
- Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Smoothed noise contrastive mutual information neural estimation
- Data-driven spatiotemporal modeling for structural dynamics on irregular domains by stochastic dependency neural estimation
- A Deep Generative Approach to Conditional Sampling
- On distributionally robust extreme value analysis
- Stein variational gradient descent with learned direction
- Geometry of EM and related iterative algorithms
- Geometrical Insights for Implicit Generative Modeling
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Level sets semimetrics for probability measures with applications in hypothesis testing
- Aggregated tests based on supremal divergence estimators for non-regular statistical models
- Computational complexity of kernel-based density-ratio estimation: a condition number analysis
- On the empirical estimation of integral probability metrics
- Convergence of latent mixing measures in finite and infinite mixture models
- Least-squares two-sample test
- Density-Difference Estimation
- Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization
- Online Direct Density-Ratio Estimation Applied to Inlier-Based Outlier Detection
- Direct Density Derivative Estimation
- Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation
- Probabilistic model validation for uncertain nonlinear systems
- Nonparametric Estimation of Kullback-Leibler Divergence
- Semi-supervised learning of class balance under class-prior change by distribution matching
- Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
- Unnamed Item
- Variational Representations and Neural Network Estimation of Rényi Divergences
- Robust Actuarial Risk Analysis
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Quantization and clustering with Bregman divergences
- Change-point detection in time-series data by relative density-ratio estimation
- Machine learning with squared-loss mutual information
- Imitation Learning as f-Divergence Minimization
- Constructive setting for problems of density ratio estimation
- Relative Density-Ratio Estimation for Robust Distribution Comparison
- Improving bridge estimators via f-GAN
- Unnamed Item
- Solving Inverse Stochastic Problems from Discrete Particle Observations Using the Fokker–Planck Equation and Physics-Informed Neural Networks
- Formulation and properties of a divergence used to compare probability measures without absolute continuity




This page was built for publication: Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization