A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
From MaRDI portal
Publication: 2998011
DOI: 10.1137/090747695
zbMath: 1215.49039
OpenAlex: W2022611944
MaRDI QID: Q2998011
Wotao Yin, Donald Goldfarb, Zaiwen Wen, Yin Zhang
Publication date: 17 May 2011
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://semanticscholar.org/paper/1e493bb0848b8a5e4003f3d98aa084462df78cad
Keywords: continuation; shrinkage; active set; compressive sensing; \(\ell_1\)-minimization; basis pursuit; subspace optimization
Numerical mathematical programming methods (65K05) Convex programming (90C25) Large-scale problems in mathematical programming (90C06)
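For context, the "shrinkage" in the title refers to the soft-thresholding operator that is the basic building block of \(\ell_1\)-minimization methods of this kind. A minimal illustrative sketch (an assumption-level example, not the authors' FPC_AS implementation):

```python
import numpy as np

def shrink(x, mu):
    """Soft-thresholding (shrinkage) operator:
    the closed-form minimizer of mu*||y||_1 + 0.5*||y - x||^2."""
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

# Entries with |x_i| <= mu are set exactly to zero, which is what
# makes shrinkage iterations produce sparse iterates.
print(shrink(np.array([3.0, -0.5, 1.2, -2.0]), 1.0))
```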
Related Items
Forward-backward quasi-Newton methods for nonsmooth optimization problems
Iteratively weighted thresholding homotopy method for the sparse solution of underdetermined linear equations
Sparse solutions to an underdetermined system of linear equations via penalized Huber loss
A proximal method for composite minimization
Compressive Sensing with Cross-Validation and Stop-Sampling for Sparse Polynomial Chaos Expansions
On the convergence of an active-set method for ℓ1 minimization
A truncated Newton algorithm for nonconvex sparse recovery
A family of second-order methods for convex \(\ell_1\)-regularized optimization
An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy
A survey on compressive sensing: classical results and recent advancements
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
An active set algorithm for nonlinear optimization with polyhedral constraints
An interior stochastic gradient method for a class of non-Lipschitz optimization problems
An active set Newton-CG method for \(\ell_1\) optimization
A regularized semi-smooth Newton method with projection steps for composite convex programs
ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals
An Iterative Reduction FISTA Algorithm for Large-Scale LASSO
Global convergence of proximal iteratively reweighted algorithm
An inexact quasi-Newton algorithm for large-scale \(\ell_1\) optimization with box constraints
Adaptive projected gradient thresholding methods for constrained \(l_0\) problems
A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
A penalty-free infeasible approach for a class of nonsmooth optimization problems over the Stiefel manifold
A simple homotopy proximal mapping algorithm for compressive sensing
An extrapolated proximal iteratively reweighted method for nonconvex composite optimization problems
Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
A cyclic projected gradient method
Gradient-based method with active set strategy for \(\ell_1\) optimization
A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
New nonsmooth equations-based algorithms for \(\ell_1\)-norm minimization and applications
Heat source identification based on \(\ell_1\) constrained minimization
An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
Nonmonotone Barzilai-Borwein gradient algorithm for \(\ell_1\)-regularized nonsmooth minimization in compressive sensing
A variable fixing version of the two-block nonlinear constrained Gauss-Seidel algorithm for \(\ell_1\)-regularized least-squares
Equivalence and strong equivalence between the sparsest and least \(\ell_1\)-norm nonnegative solutions of linear systems and their applications
A new generalized shrinkage conjugate gradient method for sparse recovery
A Barzilai-Borwein type method for minimizing composite functions
A pseudo-heuristic parameter selection rule for \(l^1\)-regularized minimization problems
An efficient adaptive forward-backward selection method for sparse polynomial chaos expansion
Decomposable norm minimization with proximal-gradient homotopy algorithm
Stochastic variance reduced gradient methods using a trust-region-like scheme
IMRO: A Proximal Quasi-Newton Method for Solving \(\ell_1\)-Regularized Least Squares Problems
Templates for convex cone problems with applications to sparse signal recovery
Fixed point and Bregman iterative methods for matrix rank minimization
Sequential sparse Bayesian learning with applications to system identification for damage assessment and recursive reconstruction of image sequences
New augmented Lagrangian-based proximal point algorithm for convex optimization with equality constraints
A Fast Active Set Block Coordinate Descent Algorithm for \(\ell_1\)-Regularized Least Squares
Linearized alternating directions method for \(\ell_1\)-norm inequality constrained \(\ell_1\)-norm minimization
A Smoothing Active Set Method for Linearly Constrained Non-Lipschitz Nonconvex Optimization
Matrix-free interior point method for compressed sensing problems
An Efficient Proximal Block Coordinate Homotopy Method for Large-Scale Sparse Least Squares Problems
Search Direction Correction with Normalized Gradient Makes First-Order Methods Faster
A second-order method for convex ℓ1-regularized optimization with active-set prediction
An active set Barzilar-Borwein algorithm for \(l_0\) regularized optimization
FPC_AS
"Active-set complexity of proximal gradient: how long does it take to find the sparsity pattern?"
Splitting augmented Lagrangian method for optimization problems with a cardinality constraint and semicontinuous variables
A Multilevel Framework for Sparse Optimization with Application to Inverse Covariance Estimation and Logistic Regression
Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
A Hybrid Finite-Dimensional RHC for Stabilization of Time-Varying Parabolic Equations
Sparse Solutions by a Quadratically Constrained ℓq (0 < q < 1) Minimization Model
A Proximal Gradient Method for Ensemble Density Functional Theory
A Three-Operator Splitting Algorithm for Nonconvex Sparsity Regularization
Minimization of \(\ell_{1-2}\) for Compressed Sensing
A decomposition method for Lasso problems with zero-sum constraint
Exact Penalty Function for \(\ell_{2,1}\) Norm Minimization over the Stiefel Manifold
An Algorithm Solving Compressive Sensing Problem Based on Maximal Monotone Operators
A Dimension Reduction Technique for Large-Scale Structured Sparse Optimization Problems with Application to Convex Clustering
An active-set proximal quasi-Newton algorithm for ℓ1-regularized minimization over a sphere constraint
An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
Solving Basis Pursuit
Nonmonotone spectral gradient method for sparse recovery
Combining line search and trust-region methods for ℓ1-minimization
Uses Software