Just relax: convex programming methods for identifying sparse signals in noise
From MaRDI portal
Publication:3547718
Recommendations
- Algorithms for simultaneous sparse approximation. II: Convex relaxation
- Templates for convex cone problems with applications to sparse signal recovery
- Stable signal recovery from incomplete and inaccurate measurements
- On sparse reconstruction from Fourier and Gaussian measurements
- Stable recovery of sparse overcomplete representations in the presence of noise
Cited in
- Linearized Bregman iterations for compressed sensing
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- On the conditioning of random subdictionaries
- Sparsity- and continuity-promoting seismic image recovery with curvelet frames
- ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals
- Homogeneous penalizers and constraints in convex image restoration
- Regularity properties of non-negative sparsity sets
- A new computational method for the sparsest solutions to systems of linear equations
- On the convergence of the SINDy algorithm
- An adaptive inverse scale space method for compressed sensing
- Autoregressive process modeling via the Lasso procedure
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- Relationship between the optimal solutions of least squares regularized with \(\ell_{0}\)-norm and constrained by \(k\)-sparsity
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Non-convex sparse regularisation
- Optimal identification experiment design for LPV systems using the local approach
- Accelerated iterative hard thresholding algorithm for \(l_0\) regularized regression problem
- A semidefinite programming study of the Elfving theorem
- Exact optimization for the \(\ell ^{1}\)-compressive sensing problem using a modified Dantzig-Wolfe method
- Model-based multiple rigid object detection and registration in unstructured range data
- Dualization of signal recovery problems
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- Iterative thresholding for sparse approximations
- Best subset selection via a modern optimization lens
- Solving basis pursuit: heuristic optimality check and solver comparison
- On the informativity of direct identification experiments in dynamical networks
- Lasso-type recovery of sparse representations for high-dimensional data
- Gradient methods for minimizing composite functions
- Robust computation of linear models by convex relaxation
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Towards a Mathematical Theory of Super‐resolution
- Inferring stable genetic networks from steady-state data
- Nearly unbiased variable selection under minimax concave penalty
- Proximal splitting methods in signal processing
- Convolutional neural networks analyzed via convolutional sparse coding
- Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization
- Mixed linear system estimation and identification
- Disparity and optical flow partitioning using extended Potts priors
- Two are better than one: fundamental parameters of frame coherence
- Rodeo: Sparse, greedy nonparametric regression
- Adaptive algorithms for sparse system identification
- Structured sparsity through convex optimization
- Locally sparse reconstruction using the \(\ell^{1,\infty}\)-norm
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- A numerical exploration of compressed sampling recovery
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- High-dimensional variable selection
- Data-driven design of two degree-of-freedom nonlinear controllers: the \(\operatorname{D}^2\)-IBC approach
- Necessary and sufficient conditions for linear convergence of \(\ell^1\)-regularization
- Support union recovery in high-dimensional multivariate regression
- Templates for convex cone problems with applications to sparse signal recovery
- Convergence of fixed-point continuation algorithms for matrix rank minimization
- A modified greedy analysis pursuit algorithm for the cosparse analysis model
- Sparse reconstruction with multiple Walsh matrices
- Sparse signal recovery using a new class of random matrices
- Fixed point and Bregman iterative methods for matrix rank minimization
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Tightness of the maximum likelihood semidefinite relaxation for angular synchronization
- Matrix-wise \(\ell_0\)-constrained sparse nonnegative least squares
- The residual method for regularizing ill-posed problems
- On stepwise pattern recovery of the fused Lasso
- Algorithms for simultaneous sparse approximation. II: Convex relaxation
- A unified approach to model selection and sparse recovery using regularized least squares
- Direct data domain STAP using sparse representation of clutter spectrum
- Registration-based compensation using sparse representation in conformal-array STAP
- A necessary and sufficient condition for exact sparse recovery by \(\ell_1\) minimization
- A modified Newton projection method for \(\ell _1\)-regularized least squares image deblurring
- Robust sparse recovery via a novel convex model
- Proximal mapping for symmetric penalty and sparsity
- Average Performance of the Sparsest Approximation Using a General Dictionary
- Statistical optimization in high dimensions
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- When do stepwise algorithms meet subset selection criteria?
- When is there a representer theorem? Reflexive Banach spaces
- A performance guarantee for orthogonal matching pursuit using mutual coherence
- SONIC: social network analysis with influencers and communities
- Minimizers of sparsity regularized Huber loss function
- Multi-layer sparse coding: the holistic way
- Piecewise-polynomial signal segmentation using convex optimization
- Theoretical guarantees for graph sparse coding
- Optimal dual certificates for noise robustness bounds in compressive sensing
- Book Review: A mathematical introduction to compressive sensing
- TV+TV regularization with nonconvex sparseness-inducing penalty for image restoration
- Iterative identification for multiple-input systems with time-delays based on greedy pursuit and auxiliary model
- Sparse set membership identification of nonlinear functions and application to fault detection
- Beyond canonical dc-optimization: the single reverse polar problem
- Stable restoration and separation of approximately sparse signals
- Sparse identification of nonlinear dynamics for model predictive control in the low-data limit
- Signal recovery under cumulative coherence
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- Consistency of \(\ell_1\) recovery from noisy deterministic measurements
- Spectral dynamics and regularization of incompletely and irregularly measured data
- Compressed sensing with structured sparsity and structured acquisition
- Foundations of gauge and perspective duality