Just relax: convex programming methods for identifying sparse signals in noise
From MaRDI portal
Publication:3547718
DOI: 10.1109/TIT.2005.864420
zbMath: 1288.94025
Wikidata: Q59750791 · Scholia: Q59750791
MaRDI QID: Q3547718
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Mathematics Subject Classification:
- Convex programming (90C25)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Detection theory in information and communication theory (94A13)
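The "just relax" idea named in the article's title is to replace combinatorial sparse recovery with a convex program such as \(\ell_1\)-penalized least squares (basis pursuit denoising). The following is a minimal illustrative sketch of that relaxation solved by iterative soft-thresholding (ISTA), not the article's own algorithm; all dimensions, parameter values, and function names are illustrative choices.

```python
import numpy as np

# Sketch: recover a sparse x from noisy measurements y = A x + e by solving
# the convex relaxation  min_x 0.5*||A x - y||^2 + lam*||x||_1  with ISTA.

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Iterative soft-thresholding for the l1-regularized least squares problem."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)   # random sensing matrix (illustrative)
x_true = np.zeros(200)
x_true[[5, 50, 120]] = [2.0, -1.5, 3.0]            # 3-sparse ground truth
y = A @ x_true + 0.01 * rng.standard_normal(80)    # noisy measurements

x_hat = ista(A, y, lam=0.05)
# x_hat should be sparse, with its largest entries on the true support {5, 50, 120}.
```

Under incoherence conditions of the kind the article analyzes, the convex relaxation recovers the support of the sparse signal despite the noise.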
Related Items
- Sparse RBF surface representations
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- Model recovery for multi-input signal-output nonlinear systems based on the compressed sensing recovery theory
- Best subset selection via a modern optimization lens
- Relationship between the optimal solutions of least squares regularized with \(\ell_0\)-norm and constrained by \(k\)-sparsity
- Signal recovery under cumulative coherence
- Minimizers of sparsity regularized Huber loss function
- High-dimensional variable selection
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- De-biasing the Lasso with degrees-of-freedom adjustment
- Data-driven design of two degree-of-freedom nonlinear controllers: the \(\operatorname{D}^2\)-IBC approach
- Non-convex sparse regularisation
- Matrix sparsification and the sparse null space problem
- Bias versus non-convexity in compressed sensing
- Regularity properties of non-negative sparsity sets
- On stepwise pattern recovery of the fused Lasso
- Gradient methods for minimizing composite functions
- Proximal methods for reweighted \(l_Q\)-regularization of sparse signal recovery
- Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization
- ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals
- Leading impulse response identification via the elastic net criterion
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- Dualization of signal recovery problems
- Optimal dual certificates for noise robustness bounds in compressive sensing
- Beyond canonical dc-optimization: the single reverse polar problem
- A smoothing method for sparse optimization over convex sets
- Model-based multiple rigid object detection and registration in unstructured range data
- Iterative identification for multiple-input systems with time-delays based on greedy pursuit and auxiliary model
- A simple homotopy proximal mapping algorithm for compressive sensing
- Exact optimization for the \(\ell^1\)-compressive sensing problem using a modified Dantzig-Wolfe method
- Support union recovery in high-dimensional multivariate regression
- The residual method for regularizing ill-posed problems
- Autoregressive process modeling via the Lasso procedure
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Two are better than one: fundamental parameters of frame coherence
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_1\) minimization
- Direct data domain STAP using sparse representation of clutter spectrum
- Registration-based compensation using sparse representation in conformal-array STAP
- New insights on the optimality conditions of the \(\ell_2-\ell_0\) minimization problem
- High-dimensional covariance estimation by minimizing \(\ell_1\)-penalized log-determinant divergence
- Iterative identification for multivariable systems with time-delays based on basis pursuit de-noising and auxiliary model
- Compressed sensing with structured sparsity and structured acquisition
- Solve exactly an under determined linear system by minimizing least squares regularized with an \(\ell_0\) penalty
- TV+TV regularization with nonconvex sparseness-inducing penalty for image restoration
- Consistency of \(\ell_1\) recovery from noisy deterministic measurements
- On the conditioning of random subdictionaries
- A necessary and sufficient condition for exact sparse recovery by \(\ell_1\) minimization
- Inferring stable genetic networks from steady-state data
- A CS recovery algorithm for model and time delay identification of MISO-FIR systems
- Control of MIMO nonlinear systems: a data-driven model inversion approach
- Optimal identification experiment design for LPV systems using the local approach
- A modified greedy analysis pursuit algorithm for the cosparse analysis model
- Sparse signal recovery using a new class of random matrices
- IDENT: identifying differential equations with numerical time evolution
- Sparsity- and continuity-promoting seismic image recovery with curvelet frames
- Limited-complexity controller tuning: a set membership data-driven approach
- Rodeo: Sparse, greedy nonparametric regression
- Homogeneous penalizers and constraints in convex image restoration
- An evaluation of the sparsity degree for sparse recovery with deterministic measurement matrices
- Tightness of the maximum likelihood semidefinite relaxation for angular synchronization
- Equivalent Lipschitz surrogates for zero-norm and rank optimization problems
- Stable restoration and separation of approximately sparse signals
- Spectral dynamics and regularization of incompletely and irregularly measured data
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Convergence of fixed-point continuation algorithms for matrix rank minimization
- A coordinate gradient descent method for \(\ell_1\)-regularized convex minimization
- A performance guarantee for orthogonal matching pursuit using mutual coherence
- Fixed point and Bregman iterative methods for matrix rank minimization
- A semidefinite programming study of the Elfving theorem
- Sparse approximate solution of partial differential equations
- Iterative reweighted noninteger norm regularizing SVM for gene expression data classification
- High-dimensional Ising model selection using \(\ell_1\)-regularized logistic regression
- Adaptive algorithms for sparse system identification
- A numerical exploration of compressed sampling recovery
- Nearly unbiased variable selection under minimax concave penalty
- Capped \(\ell_p\) approximations for the composite \(\ell_0\) regularization problem
- A unified approach to model selection and sparse recovery using regularized least squares
- When do stepwise algorithms meet subset selection criteria?
- Sorted concave penalized regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Iterative thresholding for sparse approximations
- Atoms of all channels, unite! Average case analysis of multi-channel sparse recovery using greedy algorithms
- Enhancing sparsity by reweighted \(\ell_1\) minimization
- When is there a representer theorem? Reflexive Banach spaces
- Relaxed maximum a posteriori fault identification
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Average performance of the approximation in a dictionary using an \(\ell_0\) objective
- Accelerated iterative hard thresholding algorithm for \(l_0\) regularized regression problem
- Two fast vector-wise update algorithms for orthogonal nonnegative matrix factorization with sparsity constraint
- Sparse reconstruction with multiple Walsh matrices
- Structured iterative hard thresholding with on- and off-grid applications
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- New bounds for subset selection from conic relaxations
- Theoretical guarantees for graph sparse coding
- Mixed linear system estimation and identification
- Alternating direction method of multipliers for solving dictionary learning models
- Robust computation of linear models by convex relaxation
- A modified Newton projection method for \(\ell_1\)-regularized least squares image deblurring
- Locally sparse reconstruction using the \(\ell^{1,\infty}\)-norm
- Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
- Proximal Splitting Methods in Signal Processing
- Average Performance of the Sparsest Approximation Using a General Dictionary
- Book Review: A mathematical introduction to compressive sensing
- A component lasso
- The Noise Collector for sparse recovery in high dimensions
- Analysis of simultaneous inpainting and geometric separation based on sparse decomposition
- When does OMP achieve exact recovery with continuous dictionaries?
- Newton method for \(\ell_0\)-regularized optimization
- Quadratic growth conditions and uniqueness of optimal solution to Lasso
- SONIC: social network analysis with influencers and communities
- Asymptotic Theory of \(\ell_1\)-Regularized PDE Identification from a Single Noisy Trajectory
- Model selection via reweighted partial sparse recovery
- Weakly invariant norms: geometry of spheres in the space of skew-Hermitian matrices
- Matrix-wise \(\ell_0\)-constrained sparse nonnegative least squares
- Nonlinear system identification in Sobolev spaces
- On the informativity of direct identification experiments in dynamical networks
- Biconvex Clustering
- Second order self-adaptive dynamical system for sparse signal reconstruction and applications to image recovery
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
- When Ramanujan meets time-frequency analysis in complicated time series analysis
- Stable Recovery of Sparsely Corrupted Signals Through Justice Pursuit De-Noising
- Subset Selection and the Cone of Factor-Width-k Matrices
- On optimal solutions of the constrained \(\ell_0\) regularization and its penalty problem
- Disparity and optical flow partitioning using extended Potts priors
- Proximal Mapping for Symmetric Penalty and Sparsity
- On FISTA with a relative error rule
- Foundations of Gauge and Perspective Duality
- Improved Sparse Coding Under the Influence of Perceptual Attention
- Independently Interpretable Lasso for Generalized Linear Models
- Inferring Sparse Preference Lists from Partial Information
- On Reconstructing Functions from Binary Measurements
- An adaptive inverse scale space method for compressed sensing
- Towards a Mathematical Theory of Super-resolution
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Randomized pick-freeze for sparse Sobol indices estimation in high dimension
- Linearized Bregman iterations for compressed sensing
- Duality and Convex Programming
- On the Convergence of the SINDy Algorithm
- Resolution Analysis of Imaging with \(\ell_1\) Optimization
- Low Complexity Regularization of Linear Inverse Problems
- Necessary and sufficient conditions for linear convergence of \(\ell_1\)-regularization
- Structured sparsity through convex optimization
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Understanding large text corpora via sparse machine learning
- Imaging with highly incomplete and corrupted data
- Sparsity promoting regularization for effective noise suppression in SPECT image reconstruction
- Weak-Type Estimates for the Metaplectic Representation Restricted to the Shearing and Dilation Subgroup of \(SL(2,\mathbb{R})\)
- Statistical Optimization in High Dimensions
- Sparse set membership identification of nonlinear functions and application to fault detection
- Sparse Bayesian Imaging of Solar Flares
- Sparse identification of nonlinear dynamics for model predictive control in the low-data limit
- LPV system identification for control using the local approach
- A New Computational Method for the Sparsest Solutions to Systems of Linear Equations
- Multi-Layer Sparse Coding: The Holistic Way
- Solving Basis Pursuit