A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
Publication:2848186
DOI: 10.1137/120869997
zbMath: 1280.65057
arXiv: 1203.3002
OpenAlex: W2161227280
MaRDI QID: Q2848186
Publication date: 25 September 2013
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1203.3002
Keywords: complexity; regularization; numerical examples; least squares problem; homotopy continuation method; sparse optimization; proximal gradient method
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Complexity and performance of numerical algorithms (65Y20)
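The publication concerns the \(\ell_1\)-regularized least-squares (lasso) problem, \(\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1\), solved by a proximal-gradient method combined with homotopy continuation that gradually decreases \(\lambda\) while warm-starting each stage from the previous solution. The following is a minimal sketch of that general technique only, not the authors' exact algorithm; the function names, the shrink factor `eta`, and the stopping tolerances are illustrative assumptions.

```python
# Minimal sketch of a proximal-gradient homotopy scheme for the
# l1-regularized least-squares (lasso) problem
#     minimize_x  0.5 * ||A x - b||^2 + lam * ||x||_1.
# Illustrates the general technique only; the paper's actual method uses
# adaptive stopping rules and parameter choices not reproduced here.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad(A, b, lam, x0, step, tol=1e-6, max_iter=500):
    """Proximal-gradient (ISTA) iterations for a fixed lambda."""
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)               # gradient of the smooth part
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

def prox_grad_homotopy(A, b, lam_target, eta=0.7):
    """Solve a sequence of lasso problems with geometrically decreasing
    lambda, warm-starting each stage from the previous solution."""
    n = A.shape[1]
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L with L = ||A||_2^2
    lam = np.max(np.abs(A.T @ b))              # lambda_max: x = 0 is optimal here
    x = np.zeros(n)
    while lam > lam_target:
        lam = max(eta * lam, lam_target)       # shrink the regularization
        x = prox_grad(A, b, lam, x, step, tol=1e-4)
    return prox_grad(A, b, lam_target, x, step, tol=1e-8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 400))
    x_true = np.zeros(400)
    x_true[:5] = rng.standard_normal(5)        # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(100)
    lam = 0.1 * np.max(np.abs(A.T @ b))
    x_hat = prox_grad_homotopy(A, b, lam)
    print("nonzeros:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```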
Related Items (23)
Iteratively weighted thresholding homotopy method for the sparse solution of underdetermined linear equations
A data-driven line search rule for support recovery in high-dimensional data analysis
A unified approach to error bounds for structured convex optimization problems
Fast and Reliable Parameter Estimation from Nonlinear Observations
A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Accelerate the warm-up stage in the Lasso computation via a homotopic approach
A simple homotopy proximal mapping algorithm for compressive sensing
Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
A fast homotopy algorithm for gridless sparse recovery
Decomposable norm minimization with proximal-gradient homotopy algorithm
Unnamed Item
Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
On Convergence Rates of Linearized Proximal Algorithms for Convex Composite Optimization with Applications
New analysis of linear convergence of gradient-type methods via unifying error bound conditions
Generalized Conjugate Gradient Methods for ℓ1 Regularized Convex Quadratic Programming with Finite Convergence
Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Unnamed Item
Weighted thresholding homotopy method for sparsity constrained optimization
High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
Nonregular and minimax estimation of individualized thresholds in high dimension with binary responses
An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
Uses Software
This page was built for publication: A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem