Estimation of Kullback-Leibler losses for noisy recovery problems within the exponential family
From MaRDI portal
Publication:2408232
Abstract: We address the question of estimating Kullback-Leibler losses rather than squared losses in recovery problems where the noise is distributed within the exponential family. Inspired by Stein's unbiased risk estimator (SURE), we exhibit conditions under which these losses can be estimated unbiasedly or with a controlled bias. Simulations on parameter selection problems, with applications to image denoising and variable selection under Gamma and Poisson noise, illustrate the practical value of Kullback-Leibler losses and of the proposed estimators.
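The SURE-like idea in the Poisson case can be illustrated with Hudson's identity, E[θ_i g(y)] = E[y_i g(y − e_i)] for y_i ~ Poisson(θ_i): the estimator-dependent part of the KL loss, Σ_i [f_i(y) − θ_i log f_i(y)], admits the unbiased estimate Σ_i [f_i(y) − y_i log f_i(y − e_i)]. The sketch below is a Monte-Carlo sanity check of that identity for a hypothetical affine estimator f(y) = c·y + a (chosen only for illustration; it is not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(5.0, 15.0, size=20)  # true Poisson intensities (toy values)
c, a = 0.8, 2.0                          # hypothetical affine estimator f(y) = c*y + a

def f(y):
    return c * y + a

T = 200_000
ys = rng.poisson(theta, size=(T, theta.size)).astype(float)

# Unbiased KL-risk term via Hudson's identity:
#   sum_i [ f_i(y) - y_i * log f_i(y - e_i) ]
# For this separable f, f_i(y - e_i) = c*(y_i - 1) + a (positive since a > c).
est = (f(ys) - ys * np.log(c * (ys - 1) + a)).sum(axis=1)

# Monte-Carlo "ground truth" using the (normally unknown) theta:
#   sum_i [ f_i(y) - theta_i * log f_i(y) ]
truth = (f(ys) - theta * np.log(f(ys))).sum(axis=1)

# The two averages agree in expectation.
print(est.mean(), truth.mean())
```

Averaged over many draws, the estimator matches the oracle quantity without ever using θ, which is the mechanism that makes data-driven parameter selection under KL loss possible.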
Recommendations
- On Kullback-Leibler loss and density estimation
- Variable selection using Kullback-Leibler divergence loss
- On Poisson signal estimation under Kullback-Leibler discrepancy and squared risk
- Optimal exponential bounds for aggregation of estimators for the Kullback-Leibler loss
- Estimation in a linear regression model under the Kullback-Leibler loss and its application to model selection
Cited in (7)
- Adaptive singular value shrinkage estimate for low rank tensor denoising
- Predictive risk estimation for the expectation maximization algorithm with Poisson data
- Minimax predictive density for sparse count data
- Generalized SURE for optimal shrinkage of singular values in low-rank matrix denoising
- Convergence of regularization methods with filter functions for a regularization parameter chosen with GSURE and mildly ill-posed inverse problems
- Low-rank matrix denoising for count data using unbiased Kullback-Leibler risk estimation
- Nearly minimax empirical Bayesian prediction of independent Poisson observables