A unified primal dual active set algorithm for nonconvex sparse recovery
MaRDI QID: Q2038299
DOI: 10.1214/19-STS758
OpenAlex: W3154413294
Xiliang Lu, Bangti Jin, Can Yang, Jian Huang, Jin Liu, Yu Ling Jiao
Publication date: 6 July 2021
Published in: Statistical Science
Full work available at URL: https://doi.org/10.1214/19-sts758
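The title refers to a primal dual active set (PDAS) iteration in which nonconvex penalties enter only through the thresholding rule that defines the active set. As a rough orientation, the sketch below works out the \(\ell_0\)-penalized least-squares instance, \(\min_x \tfrac12\|\Psi x - y\|^2 + \lambda\|x\|_0\): each step forms an active set from the combined primal-dual quantity \(x + d\), refits by least squares on that set, and zeroes the complementary variables. This is an illustrative reconstruction under standard PDAS assumptions, not the authors' implementation; the helper name `pdas_l0` and all parameter defaults are hypothetical.

```python
# Illustrative sketch (not the authors' code): PDAS for the l0-penalized
# least-squares problem  min_x 0.5*||Psi @ x - y||^2 + lam*||x||_0.
# The name pdas_l0 and the parameter defaults are hypothetical.
import numpy as np

def pdas_l0(Psi, y, lam, max_iter=50):
    p = Psi.shape[1]
    x = np.zeros(p)                     # primal iterate
    d = Psi.T @ y                       # dual iterate d = Psi^T (y - Psi @ x) at x = 0
    thresh = np.sqrt(2.0 * lam)         # hard-threshold level specific to the l0 penalty
    active = np.zeros(p, dtype=bool)
    for _ in range(max_iter):
        new_active = np.abs(x + d) > thresh   # active set from the thresholding rule
        if np.array_equal(new_active, active):
            break                             # support stabilized
        active = new_active
        x = np.zeros(p)
        if active.any():
            # least-squares refit restricted to the active coordinates
            x[active] = np.linalg.lstsq(Psi[:, active], y, rcond=None)[0]
        d = Psi.T @ (y - Psi @ x)             # refresh the dual variable
        d[active] = 0.0                       # complementarity: d vanishes on the support
    return x

# Toy usage: recover a 5-sparse vector from noisy random measurements.
rng = np.random.default_rng(0)
Psi = rng.standard_normal((100, 400))
Psi /= np.linalg.norm(Psi, axis=0)      # unit-norm columns, as usual in sparse recovery demos
x_true = np.zeros(400)
x_true[:5] = 3.0
y = Psi @ x_true + 0.01 * rng.standard_normal(100)
x_hat = pdas_l0(Psi, y, lam=1.0)
```

The cited companion works on PDAS with continuation wrap such an inner loop in a decreasing sequence of \(\lambda\) values, warm-starting each solve from the previous support; under the unified view of the paper, swapping the penalty changes only the thresholding rule that defines the active set.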
Related Items
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
- A truncated Newton algorithm for nonconvex sparse recovery
- Hard Thresholding Regularised Logistic Regression: Theory and Algorithms
- A data-driven line search rule for support recovery in high-dimensional data analysis
- Sparse signal recovery from phaseless measurements via hard thresholding pursuit
- A communication-efficient method for \(\ell_0\) regularization linear regression models
- A primal dual active set with continuation algorithm for high-dimensional nonconvex SICA-penalized regression
- A singular value shrinkage thresholding algorithm for folded concave penalized low-rank matrix optimization problems
- A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems
- A primal and dual active set algorithm for truncated \(L_1\) regularized logistic regression
- Smoothing Newton method for \(\ell^0\)-\(\ell^2\) regularized linear inverse problem
- High-performance statistical computing in the computing environments of the 2020s
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
Uses Software
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Global solutions to folded concave penalized nonconvex learning
- Best subset selection via a modern optimization lens
- Recovery of sparsest signals via \(\ell^q\)-minimization
- Iterative reweighted minimization methods for \(l_p\) regularized unconstrained nonlinear programming
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Smoothing methods for nonsmooth, nonconvex minimization
- Iterative thresholding for sparse approximations
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Sparsest solutions of underdetermined linear systems via \( \ell _q\)-minimization for \(0<q\leqslant 1\)
- Heuristics of instability and stabilization in model selection
- Asymptotics for Lasso-type estimators
- Nonconcave penalized likelihood with a diverging number of parameters
- Thresholding-based iterative selection procedures for model selection and shrinkage
- Sorted concave penalized regression
- Minimization of non-smooth, non-convex functionals by iterative thresholding
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- A superlinearly convergent \(R\)-regularized Newton scheme for variational models with concave sparsity-promoting priors
- Calibrating nonconvex penalized regression in ultra-high dimension
- High-dimensional graphs and variable selection with the Lasso
- Strong oracle optimality of folded concave penalized estimation
- Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed $\ell_q$ Minimization
- Optimality Conditions and a Smoothing Trust Region Newton Method for Non-Lipschitz Optimization
- Description of the Minimizers of Least Squares Regularized with $\ell_0$-norm. Uniqueness of the Global Minimizer
- An Unconstrained $\ell_q$ Minimization with $0<q\leq 1$ for Sparse Solution of Underdetermined Linear Systems
- Recovery of sparse signals using OMP and its variants: convergence analysis based on RIP
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Robust Decoding from 1-Bit Compressive Sampling with Ordinary and Regularized Least Squares
- A semismooth Newton method for Tikhonov functionals with sparsity constraints
- Restricted isometry properties and nonconvex compressive sensing
- Lagrange Multiplier Approach to Variational Problems and Applications
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Atomic Decomposition by Basis Pursuit
- The Primal-Dual Active Set Strategy as a Semismooth Newton Method
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Smoothing Methods and Semismooth Methods for Nondifferentiable Operator Equations
- Recovering Sparse Signals With a Certain Family of Nonconvex Penalties and DC Programming
- A Primal Dual Active Set Algorithm With Continuation for Compressed Sensing
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- A Statistical View of Some Chemometrics Regression Tools
- Reweighted $\ell_1$-Minimization for Sparse Solutions to Underdetermined Linear Systems
- A Remark on the Restricted Isometry Property in Orthogonal Matching Pursuit
- A variational approach to sparsity optimization based on Lagrange multiplier theory
- Signal Recovery by Proximal Forward-Backward Splitting
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Compressed sensing
- Complex wavelets for shift invariant analysis and filtering of signals
- Convergence of a block coordinate descent method for nondifferentiable minimization
- A general theory of concave regularization for high-dimensional sparse estimation problems
- A new look at the statistical model identification