Atomic Decomposition by Basis Pursuit
Publication:4210402
DOI: 10.1137/S1064827596304010
zbMath: 0919.94002
MaRDI QID: Q4210402
Scott Shaobing Chen, David L. Donoho, Michael A. Saunders
Publication date: 21 September 1998
Published in: SIAM Journal on Scientific Computing
Keywords: wavelets; denoising; time-frequency analysis; time-scale analysis; matching pursuit; multiscale edges; \(\ell^1\) norm optimization; cosine packets; interior-point methods for linear programming; overcomplete signal representation
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Algorithms for approximation of functions (65D15); Approximation by arbitrary linear expressions (41A45)
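The keywords above name the technique the paper introduces: atomic decomposition over an overcomplete dictionary by minimizing the \(\ell^1\) norm of the coefficients (basis pursuit), solved as a linear program with interior-point methods. The snippet below is a minimal illustrative sketch, not the authors' implementation: it poses basis pursuit, min \(\|x\|_1\) subject to \(Ax = b\), as a linear program and solves it with SciPy's HiGHS LP solver; the dictionary, signal, and dimensions are made-up toy values.

```python
# Sketch only: basis pursuit min ||x||_1 s.t. Ax = b, cast as a linear program.
# Variables z = [x, u] with |x_i| <= u_i; minimize sum(u).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_samples, n_atoms, sparsity = 32, 64, 4          # toy sizes, not from the paper

A = rng.standard_normal((n_samples, n_atoms))     # overcomplete dictionary (n_atoms > n_samples)
x_true = np.zeros(n_atoms)
x_true[rng.choice(n_atoms, sparsity, replace=False)] = rng.standard_normal(sparsity)
b = A @ x_true                                    # synthesized signal

c = np.concatenate([np.zeros(n_atoms), np.ones(n_atoms)])   # objective: sum of u
A_eq = np.hstack([A, np.zeros_like(A)])                     # A x = b
I = np.eye(n_atoms)
A_ub = np.vstack([np.hstack([I, -I]),                       #  x - u <= 0
                  np.hstack([-I, -I])])                     # -x - u <= 0
b_ub = np.zeros(2 * n_atoms)
bounds = [(None, None)] * n_atoms + [(0, None)] * n_atoms   # x free, u >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=bounds, method="highs")
x_hat = res.x[:n_atoms]
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

In the exact, noiseless setting this LP typically recovers the sparse coefficient vector; the paper's basis pursuit denoising variant instead penalizes \(\|Ax - b\|_2^2 + \lambda\|x\|_1\) for noisy data.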
Related Items
Tseng type methods for solving inclusion problems and its applications, On a combinatorial framework for fault characterization, Modified hybrid decomposition of the augmented Lagrangian method with larger step size for three-block separable convex programming, A frame based shrinkage procedure for fast oscillating functions, Signal recovery under cumulative coherence, Minimizers of sparsity regularized Huber loss function, Local search methods for \(\ell_{1}\)-minimization in frame based signal compression, Functional linear regression that's interpretable, Near-ideal model selection by \(\ell _{1}\) minimization, Fast \(\ell _{1}\) minimization by iterative thresholding for multidimensional NMR spectroscopy, Accelerating gradient projection methods for \(\ell _1\)-constrained signal recovery by steplength selection rules, Sparse approximation of images inspired from the functional architecture of the primary visual areas, Robust sparse component analysis based on a generalized Hough transform, Improved bounds for sparse recovery from subsampled random convolutions, Variational methods for normal integration, Solution of linear ill-posed problems by model selection and aggregation, A primal-dual homotopy algorithm for \(\ell _{1}\)-minimization with \(\ell _{\infty }\)-constraints, \(l_1\)-\(l_2\) regularization of split feasibility problems, Note on the modified relaxation CQ algorithm for the split feasibility problem, Instrumental variable-based OMP identification algorithm for Hammerstein systems, Random sampling of sparse trigonometric polynomials, Multi-frame representations in linear inverse problems with mixed multi-constraints, Landmark recognition with sparse representation classification and extreme learning machine, Mixed \(\ell_2\) and \(\ell_1\)-norm regularization for adaptive detrending with ARMA modeling, On the sign consistency of the Lasso for the high-dimensional Cox model, Prior model identification during subsurface flow data integration with adaptive sparse representation techniques, Component selection and smoothing in multivariate nonparametric regression, Alternating direction method of multipliers for separable convex optimization of real functions in complex variables, Mixed Hölder matrix discovery via wavelet shrinkage and Calderón-Zygmund decompositions, PROMP: a sparse recovery approach to lattice-valued signals, Recovery of block sparse signals under the conditions on block RIC and ROC by BOMP and BOMMP, Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression, Estimating the probability law of the codelength as a function of the approximation error in image compression, Exact support recovery for sparse spikes deconvolution, RBF-network based sparse signal recovery algorithm for compressed sensing reconstruction, A strong convergence result involving an inertial forward-backward algorithm for monotone inclusions, A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem, An alternating direction and projection algorithm for structure-enforced matrix factorization, A generalized elastic net regularization with smoothed \(\ell _{q}\) penalty for sparse vector recovery, A proximal alternating linearization method for minimizing the sum of two convex functions, Adaptive projected gradient thresholding methods for constrained \(l_0\) problems, Optimal dual certificates for noise robustness bounds in compressive sensing, Sparse and low-rank matrix regularization for learning 
time-varying Markov networks, Extensions of Gauss quadrature via linear programming, Approximating sampled sinusoids and multiband signals using multiband modulated DPSS dictionaries, A hierarchical framework for recovery in compressive sensing, Sparse Bayesian representation in time-frequency domain, Weakly-supervised cross-domain dictionary learning for visual recognition, Analysis \(\ell_1\)-recovery with frames and Gaussian measurements, A Rice method proof of the null-space property over the Grassmannian, Learning active basis models by EM-type algorithms, Finding effective points by surrogate models with overcomplete bases, Kernel methods in machine learning, On the conditioning of random subdictionaries, The LOST algorithm: Finding lines and separating speech mixtures, A coordinate gradient descent method for nonsmooth separable minimization, Sparsity in time-frequency representations, Image denoising via solution paths, Sparse approximate solution of partial differential equations, Least angle and \(\ell _{1}\) penalized regression: a review, Nonparametric denoising of signals of unknown local structure. II: Nonparametric function recovery, L1Packv2: A Mathematica package for minimizing an \(\ell _{1}\)-penalized functional, CMB data analysis and sparsity, New methods for fitting multiple sinusoids from irregularly sampled data, Processing MUSE hyperspectral data: denoising, deconvolution and detection of astrophysical sources, Sparse recovery by non-convex optimization - instance optimality, On compressive sensing applied to radar, Penalized quantile regression for dynamic panel data, Regularization of geophysical ill-posed problems by iteratively re-weighted and refined least squares, When do stepwise algorithms meet subset selection criteria?, New augmented Lagrangian-based proximal point algorithm for convex optimization with equality constraints, Enhancing sparsity of Hermite polynomial expansions by iterative rotations, Spectrum estimation for large dimensional covariance matrices using random matrix theory, Sparse regression using mixed norms, Iterative thresholding for sparse approximations, Analysis of basis pursuit via capacity sets, Enhancing sparsity by reweighted \(\ell _{1}\) minimization, Robust penalized quantile regression estimation for panel data, New cyclic sparsity measures for deconvolution based on convex relaxation, Sparse recovery via differential inclusions, Fast global convergence of gradient methods for high-dimensional statistical recovery, Customized proximal point algorithms for linearly constrained convex minimization and saddle-point problems: a unified approach, A constrained optimization reformulation and a feasible descent direction method for \(L_{1/2}\) regularization, Radar target recognition using contourlet packet transform and neural network approach, Matrix-free interior point method for compressed sensing problems, K-hyperline clustering learning for sparse component analysis, Positive shrinkage, improved pretest and absolute penalty estimators in partially linear models, Parsimonious additive models, Support vector machines with adaptive \(L_q\) penalty, Elastic-net regularization in learning theory, Compression and denoising using \(l _{0}\)-norm, Random sampling of sparse trigonometric polynomials. 
II: Orthogonal matching pursuit versus basis pursuit, Sparsity and persistence: mixed norms provide simple signal models with dependent coefficients, Compressive sensing for subsurface imaging using ground penetrating radar, Wedgelets: Nearly minimax estimation of edges, Sparse solutions to underdetermined Kronecker product systems, Generalized ADMM with optimal indefinite proximal term for linearly constrained convex optimization, Optimization of frame representations for compressed sensing and Mercedes-Benz frame, On a simple derivation of the complementary matching pursuit, The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators. With comments by Ronald A. Thisted and M. R. Osborne and a rejoinder by the authors, Combining a relaxed EM algorithm with Occam's razor for Bayesian variable selection in high-dimensional regression, Computing and analyzing recoverable supports for sparse reconstruction, Simultaneous image fusion and demosaicing via compressive sensing, Optimal sampling points in reproducing kernel Hilbert spaces, New properties of forward-backward splitting and a practical proximal-descent algorithm, Best subset selection via a modern optimization lens, Data-driven atomic decomposition via frequency extraction of intrinsic mode functions, Inducing wavelets into random fields via generative boosting, Iterative methods for solving the multiple-sets split feasibility problem with splitting self-adaptive step size, Statistical consistency of coefficient-based conditional quantile regression, Phase retrieval from Gabor measurements, Nonlinear residual minimization by iteratively reweighted least squares, A proximal method for composite minimization, Projection algorithms for nonconvex minimization with application to sparse principal component analysis, A dual method for minimizing a nonsmooth objective over one smooth inequality constraint, Conjugate gradient acceleration of iteratively re-weighted least squares methods, Sparse high-dimensional FFT based on rank-1 lattice sampling, Revisiting compressed sensing: exploiting the efficiency of simplex and sparsification methods, Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization, A weighted \(\ell_1\)-minimization approach for sparse polynomial chaos expansions, Compressive sampling of polynomial chaos expansions: convergence analysis and sampling strategies, Gradient methods for minimizing composite functions, The essential ability of sparse reconstruction of different compressive sensing strategies, Compressive sensing SAR range compression with chirp scaling principle, Sparse microwave imaging: principles and applications, Compressed sensing SAR imaging based on sparse representation in fractional Fourier domain, Spectral intrinsic decomposition method for adaptive signal representation, Oracle inequalities for the lasso in the Cox model, Accelerated Bregman method for linearly constrained \(\ell _1-\ell _2\) minimization, ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals, Learning gradients on manifolds, Approximation accuracy, gradient methods, and error bound for structured convex optimization, Theory of compressive sensing via \(\ell_1\)-minimization: a non-RIP analysis and extensions, The Gelfand widths of \(\ell_p\)-balls for \(0 < p \leq 1\), Feature selection when there are many influential features, Remote sensing via \(\ell_1\)-minimization, Sharp 
recovery bounds for convex demixing, with applications, Restricted isometries for partial random circulant matrices, Multi-scale geometric methods for data sets. II: Geometric multi-resolution analysis, Sparse conjugate directions pursuit with application to fixed-size kernel models, Sparse canonical correlation analysis, Sparse Legendre expansions via \(\ell_1\)-minimization, On the representation of functions with Gaussian wave packets, Group coordinate descent algorithms for nonconvex penalized regression, Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization, Two are better than one: fundamental parameters of frame coherence, Matrix-free interior point method, Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data, The solution path of the generalized lasso, A unified primal-dual algorithm framework based on Bregman iteration, Exact reconstruction using Beurling minimal extrapolation, Convex feasibility modeling and projection methods for sparse signal recovery, Degrees of freedom in lasso problems, UPS delivers optimal phase diagram in high-dimensional variable selection, Lagrangian-penalization algorithm for constrained optimization and variational inequalities, Proximal methods for the latent group lasso penalty, Discussion: Time-threshold maps: using information from wavelet reconstructions with all threshold values simultaneously, Rejoinder: Time-threshold maps: using information from wavelet reconstructions with all threshold values simultaneously, The sparse Laplacian shrinkage estimator for high-dimensional regression, 2DPCA with L1-norm for simultaneously robust and sparse modelling, An infeasible-point subgradient method using adaptive approximate projections, Iterative reweighted minimization methods for \(l_p\) regularized unconstrained nonlinear programming, First-order optimality condition of basis pursuit denoise problem, Bias-variance trade-off for prequential model list selection, Robust face recognition via block sparse Bayesian learning, Sparse time-frequency representation of nonlinear and nonstationary data, On higher order isotropy conditions and lower bounds for sparse quadratic forms, High-dimensional inference in misspecified linear models, A Barzilai-Borwein type method for minimizing composite functions, Compressive sensing using chaotic sequence based on Chebyshev map, Disjoint sparsity for signal separation and applications to hybrid inverse problems in medical imaging, Bayesian signal detection with compressed measurements, On the solution uniqueness characterization in the L1 norm and polyhedral gauge recovery, A modified greedy analysis pursuit algorithm for the cosparse analysis model, Support recovery for sparse super-resolution of positive measures, Sparse decomposition by iterating Lipschitzian-type mappings, Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset, Compressed sensing from a harmonic analysis point of view, Data based identification and prediction of nonlinear and complex dynamical systems, Time-invariant Radon transform by generalized Fourier slice theorem, Interpretable sparse SIR for functional data, High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity, A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization, Learning the coordinate gradients, Surveying and comparing simultaneous sparse 
approximation (or group-lasso) algorithms, Non-smooth equations based method for \(\ell_1\)-norm problems with applications to compressed sensing, A non-adapted sparse approximation of PDEs with stochastic inputs, Compressive wave computation, Regularized vector field learning with sparse approximation for mismatch removal, Estimation of (near) low-rank matrices with noise and high-dimensional scaling, Adaptive algorithms for sparse system identification, A Bayesian lasso via reversible-jump MCMC, Projected Landweber iteration for matrix completion, Stochastic expansions using continuous dictionaries: Lévy adaptive regression kernels, Estimator selection in the Gaussian setting, A unified approach to model selection and sparse recovery using regularized least squares, A modified local quadratic approximation algorithm for penalized optimization problems, A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations, An alternating direction method with continuation for nonconvex low rank minimization, Strong convergence of a modified proximal algorithm for solving the lasso, A proximal alternating direction method for \(\ell_{2,1}\)-norm least squares problem in multi-task feature learning, Global sensitivity analysis based on high-dimensional sparse surrogate construction, Entropic Regularization of the ℓ 0 Function, Large sparse signal recovery by conjugate gradient algorithm based on smoothing technique, Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization, Splitting matching pursuit method for reconstructing sparse signal in compressed sensing, Image compression through a projection onto a polyhedral set, Feature space perspectives for learning the kernel, Synchrotron radiation-based \(l_1\)-norm regularization on micro-CT imaging in shale structure analysis, A sparse recovery method for DOA estimation based on the sample covariance vectors, Local linear convergence of a primal-dual algorithm for the augmented convex models, Suprema of Chaos Processes and the Restricted Isometry Property, Compressed sensing for finite-valued signals, Adaptive step-size matching pursuit algorithm for practical sparse reconstruction, When does OMP achieve exact recovery with continuous dictionaries?, \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?, A primal Douglas-Rachford splitting method for the constrained minimization problem in compressive sensing, The degrees of freedom of partly smooth regularizers, Globally sparse and locally dense signal recovery for compressed sensing, Two results on slime mold computations, RNLp: mixing nonlocal and TV-Lp methods to remove impulse noise from images, Matrix-Free Convex Optimization Modeling, An Augmented Lagrangian Method for Non-Lipschitz Nonconvex Programming, A simple homotopy proximal mapping algorithm for compressive sensing, Basis adaptive sample efficient polynomial chaos (BASE-PC), Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators, Phase transitions in semidefinite relaxations, Support union recovery in high-dimensional multivariate regression, A novel algorithm for approximating common solution of a system of monotone inclusion problems and common fixed point problem, A consistent algorithm to solve Lasso, elastic-net and Tikhonov regularization, Adaptive iterative hard thresholding for low-rank matrix recovery and rank-one measurements, Compressive Sensing, On the sparsity 
of Lasso minimizers in sparse data recovery, A relaxed-PPA contraction method for sparse signal recovery, On the uniform convergence of empirical norms and inner products, with application to causal inference, Image decomposition using optimally sparse representations and a variational approach, Searching-and-averaging method of underdetermined blind speech signal separation in time domain, Beyond sparsity: recovering structured representations by \({\ell}^1\) minimization and greedy algorithms, Pathwise coordinate optimization, Gaussian averages of interpolated bodies and applications to approximate reconstruction, The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder)., A bilinear algorithm for sparse representations, Ways to sparse representation: An overview, A numerical exploration of compressed sampling recovery, Feature selection for high-dimensional data, Boosting for high-dimensional linear models, Structured sparsity promoting functions, On a monotone scheme for nonconvex nonsmooth optimization with applications to fracture mechanics, Basis pursuit for frequency-domain identification, Duality and Convex Programming, Starlet Transform in Astronomical Data Processing, A Compressive Sensing Based Analysis of Anomalies in Generalized Linear Models, Nesterov's algorithm solving dual formulation for compressed sensing, Reprint of ``Nesterov's algorithm solving dual formulation for compressed sensing, Far Field Splitting by Iteratively Reweighted $\ell^1$ Minimization, Support vector machines: a nonlinear modelling and control perspective, Estimation in High Dimensions: A Geometric Perspective, Low Complexity Regularization of Linear Inverse Problems, The stochastic properties of \(\ell^1\)-regularized spherical Gaussian fields, On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions, and Algorithms, Leave-one-out cross-validation is risk consistent for Lasso, On the differences between \(L_2\) boosting and the Lasso, Nonconvex sorted \(\ell_1\) minimization for sparse approximation, ROS regression: integrating regularization with optimal scaling regression, OR Forum—An Algorithmic Approach to Linear Regression, An efficient privacy-preserving compressive data gathering scheme in WSNs, DC Approximation Approach for ℓ0-minimization in Compressed Sensing, Model recovery for Hammerstein systems using the auxiliary model based orthogonal matching pursuit method, Exact reconstruction of the nonnegative measures using model sets, A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization, Projection onto a Polyhedron that Exploits Sparsity, Sharp oracle inequalities for low-complexity priors, Identification of interactions in fractional-order systems with high dimensions, Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions, \(h\)-\(p\) adaptive model based approximation of moment free sensitivity indices, A new relaxed CQ algorithm for solving split feasibility problems in Hilbert spaces and its applications, Splitting augmented Lagrangian method for optimization problems with a cardinality constraint and semicontinuous variables, Critical behavior and universality classes for an algorithmic phase transition in sparse reconstruction, A simple Gaussian measurement bound for exact recovery of block-sparse signals, Designing Gabor windows using convex optimization, A Network of Spiking Neurons for Computing Sparse 
Representations in an Energy-Efficient Way, Sparse Representation of Signals in Hardy Space, Single image super-resolution by non-linear sparse representation and support vector regression, A modified primal-dual method with applications to some sparse recovery problems, Region-based image-fusion framework for compressive imaging, Weaker regularity conditions and sparse recovery in high-dimensional regression, A smoothed \(l_0\)-norm and \(l_1\)-norm regularization algorithm for computed tomography, The Adaptive Gril Estimator with a Diverging Number of Parameters, Lasso penalized semiparametric regression on high-dimensional recurrent event data via coordinate descent, Alternating direction method of multipliers for solving dictionary learning models, Simultaneous cartoon and texture image inpainting using morphological component analysis (MCA), Theory and applications of compressed sensing, An Overview of Computational Sparse Models and Their Applications in Artificial Intelligence, Error analysis of reweighted \(l_1\) greedy algorithm for noisy reconstruction, Primal–dual first-order methods for a class of cone programming, On sparse representation of analytic signal in Hardy space, PHASE TRANSITIONS IN ERROR CORRECTING AND COMPRESSED SENSING BY ℓ1 LINEAR PROGRAMMING, An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization, A modified Newton projection method for \(\ell _1\)-regularized least squares image deblurring, Sparse proximal support vector machine with a specialized interior-point method, Exact penalization for cardinality and rank-constrained optimization problems via partial regularization, A regularized interior-point method for constrained linear least squares, Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem, Nonlinear Extraction of Independent Components of Natural Images Using Radial Gaussianization, Error Analysis of Coefficient-Based Regularized Algorithm for Density-Level Detection, FRIST—flipping and rotation invariant sparsifying transform learning and applications, Difference-of-Convex Learning: Directional Stationarity, Optimality, and Sparsity, A smoothing inertial neural network for sparse signal reconstruction with noise measurements via \(L_p-L_1\) minimization, Framework for segmented threshold \(\ell_0\) gradient approximation based network for sparse signal recovery, Heuristics for Finding Sparse Solutions of Linear Inequalities, Block-sparse recovery and rank minimization using a weighted \(l_p-l_q\) model, Time‐optimal L1/L2 norms optimal control for linear time‐invariant systems, Double inertial parameters forward-backward splitting method: Applications to compressed sensing, image processing, and SCAD penalty problems, Study on \(L_1\) over \(L_2\) Minimization for nonnegative signal recovery, Support Recovery and Parameter Identification of Multivariate ARMA Systems with Exogenous Inputs, Faster randomized block sparse Kaczmarz by averaging, Convergence of Extrapolated Dynamic StringAveraging Cutter Methods and Applications, A wonderful triangle in compressed sensing, Heavy-ball-based optimal thresholding algorithms for sparse linear inverse problems, Sparse control for continuous‐time systems, Convergence analysis of modified inertial forward–backward splitting scheme with applications, Sampling rates for \(\ell^1\)-synthesis, Estimating a Potential Without the Agony of the Partition Function, LASSO Reloaded: A Variational Analysis Perspective with Applications to 
Compressed Sensing, Tsallis entropy based uncertainty relations on sparse representation for vector and matrix signals, A non-convex piecewise quadratic approximation of \(\ell_0\) regularization: theory and accelerated algorithm, Concentration of measure bounds for matrix-variate data with missing values, A Lorentzian-\(\ell_p\) norm regularization based algorithm for recovering sparse signals in two types of impulsive noise, Local linear convergence of proximal coordinate descent algorithm, Scaled proximal gradient methods for sparse optimization problems, High-dimensional composite quantile regression: optimal statistical guarantees and fast algorithms, Extrapolated simultaneous block‐iterative cutter methods and applications, Identifying stochastic governing equations from data of the most probable transition trajectories, Inexact generalized ADMM with relative error criteria for linearly constrained convex optimization problems, A Path-Based Approach to Constrained Sparse Optimization, Sparse spikes super-resolution on thin grids II: the continuous basis pursuit, Multiple Spectral Kernel Learning and a Gaussian Complexity Computation, Dimensionality-Dependent Generalization Bounds for k-Dimensional Coding Schemes, Margin Error Bounds for Support Vector Machines on Reproducing Kernel Banach Spaces, MODEL SELECTION AND INFERENCE: FACTS AND FICTION, Non‐negative matrix factorization and its application in blind sparse source separation with less sensors than sources, Consistent parameter estimation for Lasso and approximate message passing, Unsupervised learning of compositional sparse code for natural image representation, An introduction to continuous optimization for imaging, Influences of preconditioning on the mutual coherence and the restricted isometry property of Gaussian/Bernoulli measurement matrices, Adaptive surface-related multiple subtraction using sparse norm minimization method, Structured sparsity through convex optimization, High-dimensional regression with unknown variance, A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers, Greedy approximate projection for magnetic resonance fingerprinting with partial volumes, Analysis of Quantization Noise Spectrum in Signal Reconstruction, Characterization of ℓ1 minimizer in one-bit compressed sensing, Perturbation Analysis of Orthogonal Least Squares, Extraction of Intrawave Signals Using the Sparse Time-Frequency Representation Method, On the Compressive Spectral Method, Minimization of $\ell_{1-2}$ for Compressed Sensing, Stochastic Collocation Algorithms Using $l_1$-Minimization for Bayesian Solution of Inverse Problems, The Restricted Isometry Property of Subsampled Fourier Matrices, A New Computational Method for the Sparsest Solutions to Systems of Linear Equations, A primal-dual flow for affine constrained convex optimization, Discussion: ``A significance test for the lasso, Discussion: ``A significance test for the lasso, Discussion: ``A significance test for the lasso, Discussion: ``A significance test for the lasso, Discussion: ``A significance test for the lasso, Discussion of: Treelets -- an adaptive multi-scale basis for sparse unordered data, On the Uniqueness of Sparse Time-Frequency Representation of Multiscale Data, Pattern recovery and signal denoising by SLOPE when the design matrix is orthogonal, Bayesian hierarchical dictionary learning, Adaptive Geometric Multiscale Approximations for Intrinsically Low-dimensional Data, Solving the OSCAR and SLOPE Models 
Using a Semismooth Newton-Based Augmented Lagrangian Method, Seemingly Unrelated Ridge Regression in Semiparametric Models, Eventual linear convergence of the Douglas-Rachford iteration for basis pursuit, Estimation of block sparsity in compressive sensing, Robust Decoding from 1-Bit Compressive Sampling with Ordinary and Regularized Least Squares, A Least-Squares Method for Sparse Low Rank Approximation of Multivariate Functions, Monotone Smoothing Splines using General Linear Systems, Sparsity Based Methods for Overparameterized Variational Problems, Efficient Blind Compressed Sensing Using Sparsifying Transforms with Convergence Guarantees and Application to Magnetic Resonance Imaging, Sparse signal recovery by accelerated ℓq (0<q<1) thresholding algorithm, Book Review: A mathematical introduction to compressive sensing, Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT, Global optimization for sparse solution of least squares problems, Wasserstein Loss for Image Synthesis and Restoration, A proximal block minimization method of multipliers with a substitution procedure, A Survey of Compressed Sensing, Temporal Compressive Sensing for Video, Compressive Sensing in Acoustic Imaging, Quantization and Compressive Sensing, Structured Sparsity: Discrete and Convex Approaches, A component lasso, On Different Facets of Regularization Theory, Approximation Bounds for Some Sparse Kernel Regression Algorithms, Sensitivity Analysis for Mirror-Stratifiable Convex Functions, Discriminative Sparse Representations, Minimization of $L_1$ Over $L_2$ for Sparse Signal Recovery with Convergence Guarantee, Boosting of Image Denoising Algorithms, A Proximal Strictly Contractive Peaceman--Rachford Splitting Method for Convex Programming with Applications to Imaging, A Continuous Exact $\ell_0$ Penalty (CEL0) for Least Squares Regularized Problem, Nonconvex flexible sparsity regularization: theory and monotone numerical schemes, Analysis of Sparse Representation and Blind Source Separation, A hierarchical Bayesian perspective on majorization-minimization for non-convex sparse regression: application to M/EEG source imaging, Linear Time Dynamic Programming for Computing Breakpoints in the Regularization Path of Models Selected From a Finite Set, Monte Carlo Simulation for Lasso-Type Problems by Estimator Augmentation, A theoretical study of COmpRessed SolvING for advection-diffusion-reaction problems, Divided Differences, Falling Factorials, and Discrete Splines: Another Look at Trend Filtering and Related Problems, Sampling the Fourier Transform Along Radial Lines, Sparse Time-Frequency Decomposition for Multiple Signals with Same Frequencies, Unnamed Item, Model selection with low complexity priors, Adapting to unknown noise level in sparse deconvolution, A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems, Proximal Mapping for Symmetric Penalty and Sparsity, Efficient regularization with wavelet sparsity constraints in photoacoustic tomography, On Linguistic Variables and Sparse Representations, Constructing New Weighted ℓ1-Algorithms for the Sparsest Points of Polyhedral Sets, Online Decision Making with High-Dimensional Covariates, Stability analysis of a class of sparse optimization problems, STRUCTURE AND TEXTURE IMAGE INPAINTING USING SPARSE REPRESENTATIONS AND AN ITERATIVE CURVELET THRESHOLDING APPROACH, Projected Newton method for noise constrained ℓ p regularization, Robust uncertainty 
principles: exact signal reconstruction from highly incomplete frequency information, Musical audio analysis using sparse representations, On Debiasing Restoration Algorithms: Applications to Total-Variation and Nonlocal-Means, Unnamed Item, Non-convex Optimization via Strongly Convex Majorization-minimization, Model selection approaches for non-linear system identification: a review, Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods, Greedy approach to sparse multi-path channel estimation using sensing dictionary, General alternative regularization methods for split equality common fixed-point problem, An Online Generalized Multiscale Discontinuous Galerkin Method (GMsDGM) for Flows in Heterogeneous Media, Iteratively Reweighted Group Lasso Based on Log-Composite Regularization, A BREGMAN ITERATIVE REGULARIZATION METHOD FOR WAVELET-BASED IMAGE DEBLURRING, Search for Sparse Active Inputs: A Review, A Tight Bound of Hard Thresholding, Smooth Blockwise Iterative Thresholding: A Smooth Fixed Point Estimator Based on the Likelihood’s Block Gradient, Model building with likelihood basis pursuit, Deterministic matrices matching the compressed sensing phase transitions of Gaussian random matrices, Penalized Triograms: Total Variation Regularization for Bivariate Smoothing, Iterative thresholding compressed sensing MRI based on contourlet transform, Unnamed Item, Unnamed Item, Fast Sparse Representation Based on Smoothed ℓ0 Norm, Dictionary Learning for L1-Exact Sparse Coding, Sparse Component Analysis in Presence of Noise Using an Iterative EM-MAP Algorithm, First- and Second-Order Methods for Online Convolutional Dictionary Learning, On Collaborative Compressive Sensing Systems: The Framework, Design, and Algorithm, Elastic-Net Regularization: Iterative Algorithms and Asymptotic Behavior of Solutions, SHRINKAGE, PRETEST AND ABSOLUTE PENALTY ESTIMATORS IN PARTIALLY LINEAR MODELS, The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy, Optimal $k$-Thresholding Algorithms for Sparse Optimization Problems, PERFORMANCE LIMITS FOR ESTIMATORS OF THE RISK OR DISTRIBUTION OF SHRINKAGE-TYPE ESTIMATORS, AND SOME GENERAL LOWER RISK-BOUND RESULTS, 9 Kernel methods for surrogate modeling, Generalized Conjugate Gradient Methods for ℓ1 Regularized Convex Quadratic Programming with Finite Convergence, Sparse Recovery via Partial Regularization: Models, Theory, and Algorithms, Weak Stability of ℓ1-Minimization Methods in Sparse Data Reconstruction, Frames for compressed sensing using coherence, Tensor Least Angle Regression for Sparse Representations of Multidimensional Signals, A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization, Particle swarm stepwise (PaSS) algorithm for information criteria-based variable selections, An ℓ<sub>q</sub>- Seminorm Variational Model for Impulse Noise Reduction, Estimation of Sparse Nonnegative Sources from Noisy Overcomplete Mixtures Using MAP, A review on restoration of seismic wavefields based on regularization and compressive sensing, Reproducing Kernel Banach Spaces with the ℓ1 Norm II: Error Analysis for Regularized Least Square Regression, ℓ 1 − αℓ 2 minimization methods for signal and image reconstruction with impulsive noise removal, Necessary and Sufficient Conditions for Noiseless Sparse Recovery via Convex Quadratic Splines, Unnamed Item, A tale of three probabilistic families: Discriminative, descriptive, and generative models, A NEW APPROACH TO SELECT 
THE BEST SUBSET OF PREDICTORS IN LINEAR REGRESSION MODELLING: BI-OBJECTIVE MIXED INTEGER LINEAR PROGRAMMING, Extended Dai-Yuan conjugate gradient strategy for large-scale unconstrained optimization with applications to compressive sensing, Unconstrained \(\ell_1\)-\(\ell_2\) minimization for sparse recovery via mutual coherence, Sparse solutions to an underdetermined system of linear equations via penalized Huber loss, Quantile regression for longitudinal data, Surprises in high-dimensional ridgeless least squares interpolation, A solution approach for cardinality minimization problem based on fractional programming, Iterative method with inertial terms for nonexpansive mappings: applications to compressed sensing, CLOT norm minimization for continuous hands-off control, A hybrid quasi-Newton projected-gradient method with application to lasso and basis-pursuit denoising, Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation, An iterative algorithm for fitting nonconvex penalized generalized linear models with grouped predictors, Linear program relaxation of sparse nonnegative recovery in compressive sensing microarrays, Strengthening hash families and compressive sensing, Sparsity driven people localization with a heterogeneous network of cameras, A polynomial chaos framework for probabilistic predictions of storm surge events, A smoothing method for sparse optimization over convex sets, Sparse recovery with integrality constraints, Adaptive spatial-spectral dictionary learning for hyperspectral image restoration, Compressed sensing and matrix completion with constant proportion of corruptions, Block-sparse recovery of semidefinite systems and generalized null space conditions, An iterative algorithm for sparse and constrained recovery with applications to divergence-free current reconstructions in magneto-encephalography, A safe reinforced feature screening strategy for Lasso based on feasible solutions, Atomic norm minimization for decomposition into complex exponentials and optimal transport in Fourier domain, The Lasso problem and uniqueness, Processing of sparse signals and mutual coherence of ``measurable vectors, Optimization methods for regularization-based ill-posed problems: a survey and a multi-objective framework, A short note on compressed sensing with partially known signal support, Linear convergence of the randomized sparse Kaczmarz method, Efficient LED-SAC sparse estimator using fast sequential adaptive coordinate-wise optimization (LED-2SAC), TV+TV regularization with nonconvex sparseness-inducing penalty for image restoration, The smoothing FR conjugate gradient method for solving a kind of nonsmooth optimization problem with \(l_1\)-norm, A two-stage regularization method for variable selection and forecasting in high-order interaction model, Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing, New iterative regularization methods for the multiple-sets split feasibility problem, Best subset, forward stepwise or Lasso? 
Analysis and recommendations based on extensive comparisons, A new piecewise quadratic approximation approach for \(L_0\) norm minimization problem, Maximal solutions of sparse analysis regularization, Signal recovery under mutual incoherence property and oracle inequalities, A perturbation view of level-set methods for convex optimization, Oscillation of metropolis-Hastings and simulated annealing algorithms around LASSO estimator, A distributed one-step estimator, Level-set methods for convex optimization, Learning semidefinite regularizers, Efficient histogram dictionary learning for text/image modeling and classification, A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization, Sparsity-promoting elastic net method with rotations for high-dimensional nonlinear inverse problem, Existence, uniqueness, and approximation solutions to linearized Chandrasekhar equation with sharp bounds, Combinatorial rigidity of incidence systems and application to dictionary learning, Graph-induced restricted Boltzmann machines for document modeling, Uniqueness of the minimal \(l_1\)-norm solution to the monotone linear complementarity problem, Comparing the estimates of ROC curves by modeling methods, A significance test for the lasso, Discussion: ``A significance test for the lasso, Rejoinder: ``A significance test for the lasso, Optimal estimation of slope vector in high-dimensional linear transformation models, Homogeneous penalizers and constraints in convex image restoration, Incrementally updated gradient methods for constrained and regularized optimization, Spectral compressive sensing, Data-driven time-frequency analysis, Local behavior of sparse analysis regularization: applications to risk estimation, Stable restoration and separation of approximately sparse signals, Penalized regression combining the \( L_{1}\) norm and a correlation based penalty, A performance guarantee for orthogonal matching pursuit using mutual coherence, Sparse approximate reconstruction decomposed by two optimization problems, Iterative algorithms for total variation-like reconstructions in seismic tomography, Analysis of the self projected matching pursuit algorithm, Compressed solving: a numerical approximation technique for elliptic PDEs based on compressed sensing, Computing the spark: mixed-integer programming for the (vector) matroid girth problem, Necessary and sufficient conditions of solution uniqueness in 1-norm minimization, Iterative reweighted noninteger norm regularizing SVM for gene expression data classification, Energy preserved sampling for compressed sensing MRI, Properties and iterative methods for the \(Q\)-lasso, A hybrid scheme for encoding audio signal using hidden Markov models of waveforms, Complexity and applications of the homotopy principle for uniformly constrained sparse minimization, Sequential sparse Bayesian learning with applications to system identification for damage assessment and recursive reconstruction of image sequences, Homologous control of protein signaling networks, Recovery of seismic wavefields by an \(l_{q}\)-norm constrained regularization method, Model recovery for Hammerstein systems using the hierarchical orthogonal matching pursuit method, Underdetermined separation of speech mixture based on sparse Bayesian learning, A study on distributed optimization over large-scale networked systems, Time-scale-chirp\(_-\)rate operator for recovery of non-stationary signal components with crossover instantaneous frequency curves, A unified 
primal dual active set algorithm for nonconvex sparse recovery, Augmented sparse reconstruction of protein signaling networks, Modified forward-backward splitting method for variational inclusions, Structured overcomplete sparsifying transform learning with convergence guarantees and applications, Toward fast transform learning, Regularization parameter selection for the low rank matrix recovery, A modulus-based iterative method for sparse signal recovery, An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization, Sampling from non-smooth distributions through Langevin diffusion, Overcomplete representation in a hierarchical Bayesian framework, Super-resolution for doubly-dispersive channel estimation, Robust sparse recovery via a novel convex model, Learning ``best kernels from data in Gaussian process regression. With application to aerodynamics, On the robustness of minimum norm interpolators and regularized empirical risk minimizers, Asymptotics for Lasso-type estimators., Penalized wavelet estimation and robust denoising for irregular spaced data, AdaBoost and robust one-bit compressed sensing, GRPDA revisited: relaxed condition and connection to Chambolle-Pock's primal-dual algorithm, A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks, Sparse recovery of sound fields using measurements from moving microphones
Uses Software