Adapting to unknown noise level in sparse deconvolution
DOI: 10.1093/IMAIAI/IAW024 · zbMATH Open: 1383.62186 · arXiv: 1606.04760 · OpenAlex: W2962682711 · MaRDI QID: Q4603711 · FDO: Q4603711
Authors: Claire Boyer, Yohann De Castro, Joseph Salmon
Publication date: 19 February 2018
Published in: Information and Inference: A Journal of the IMA
Full work available at URL: https://arxiv.org/abs/1606.04760
Recommendations
- Sparse deconvolution using adaptive mixed-Gaussian models
- Adaptive de-noising of low SNR signals
- Deconvolution under Poisson noise using exact data fidelity and synthesis or analysis sparsity priors
- Scientific article (zbMATH DE number 1742181)
- Adaptive Image Denoising by Mixture Adaptation
- Estimation of Signal-Dependent Noise Level Function in Transform Domain via a Sparse Recovery Model
- From Bernoulli–Gaussian Deconvolution to Sparse Signal Restoration
- Adaptive Superresolution in Deconvolution of Sparse Peaks
Keywords: model selection; sparsity; deconvolution; inverse problems; convex regularization; Rice method; square-root Lasso; concomitant Beurling Lasso; scaled Lasso
MSC classification:
- Point estimation (62F10)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Cites Work
- Title not available
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- A study of error variance estimation in Lasso regression
- Atomic Decomposition by Basis Pursuit
- Convex analysis and monotone operator theory in Hilbert spaces
- Robust Statistics
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- Scaled sparse linear regression
- \(\ell_{1}\)-penalization for mixture regression models
- Adaptive estimation of a quadratic functional by model selection.
- On the prediction performance of the Lasso
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Graph implementations for nonsmooth convex programs
- Level Sets and Extrema of Random Processes and Fields
- \(L_1\)-penalization in functional linear regression with subgaussian design
- Near Minimax Line Spectral Estimation
- Super-resolution from noisy data
- Towards a Mathematical Theory of Super-resolution
- Near-ideal model selection by \(\ell _{1}\) minimization
- Non-uniform spline recovery from small degree polynomial approximation
- Exact recovery of non-uniform splines from the projection onto spaces of algebraic polynomials
- Spike detection from inaccurate samplings
- Robust recovery of stream of pulses using convex optimization
- Exact reconstruction using Beurling minimal extrapolation
- Super-Resolution on the Sphere Using Convex Optimization
- Super-resolution of point sources via convex programming
- Compressed Sensing Off the Grid
- Positive trigonometric polynomials and signal processing applications
- Exact support recovery for sparse spikes deconvolution
- Robust Regression and Lasso
- Inverse problems in spaces of measures
- Prediction and Discovery
Cited In (8)
- The MLE is a reliable source: sharp performance guarantees for localization problems
- Sampling the Fourier transform along radial lines
- Off-the-grid prediction and testing for linear combination of translated features
- Localization of point scatterers via sparse optimization on measures
- Simultaneous off-the-grid learning of mixtures issued from a continuous dictionary
- Prediction bounds for higher order total variation regularized least squares
- The basins of attraction of the global minimizers of the non-convex sparse spike estimation problem
- Structural, Syntactic, and Statistical Pattern Recognition
Uses Software
This page was built for publication: Adapting to unknown noise level in sparse deconvolution