A family of second-order methods for convex \(\ell _1\)-regularized optimization
From MaRDI portal
Publication:312690
DOI: 10.1007/s10107-015-0965-3  zbMath: 1350.49046  OpenAlex: W2287259016  MaRDI QID: Q312690
Figen Oztoprak, Gillian M. Chin, Richard H. Byrd, Jorge Nocedal
Publication date: 16 September 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-015-0965-3
convex optimization; second-order methods; \(\ell _1\)-regularization; block active set method; orthant-based method; quadratic problems; semismooth Newton method
Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Newton-type methods (49M15); Numerical methods based on nonlinear programming (49M37)
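The keywords above refer to the problem of minimizing \(f(x) + \lambda \|x\|_1\) for smooth convex \(f\). As context for the cited works on iterative shrinkage-thresholding, the following is a minimal first-order sketch (ISTA for least squares) — a baseline, not the second-order methods proposed in the paper itself; all names and the step-size choice are illustrative assumptions:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, steps=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by iterative
    shrinkage-thresholding with a fixed 1/L step size."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)  # proximal gradient step
    return x
```

For \(A = I\) this reduces to one soft-thresholding of \(b\), which is the closed-form solution used in the cited de-noising literature.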
Related Items
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- An active set Newton-CG method for \(\ell_1\) optimization
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- An Iterative Reduction FISTA Algorithm for Large-Scale LASSO
- A Feasible Active Set Method for Strictly Convex Quadratic Problems with Simple Bounds
- A Reduced-Space Algorithm for Minimizing $\ell_1$-Regularized Convex Functions
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- FaRSA for ℓ1-regularized convex optimization: local convergence and numerical experience
- Optimization Methods for Large-Scale Machine Learning
- A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-Regularized Least Squares
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- An Efficient Proximal Block Coordinate Homotopy Method for Large-Scale Sparse Least Squares Problems
- A block principal pivoting algorithm for vertical generalized LCP with a vertical block P-matrix
- An Algorithmic Characterization of P-matricity II: Adjustments, Refinements, and Validation
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- A globally convergent primal-dual active-set framework for large-scale convex quadratic optimization
- An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising
- A decomposition method for Lasso problems with zero-sum constraint
- An active-set proximal quasi-Newton algorithm for ℓ1-regularized minimization over a sphere constraint
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Nonconvergence of the plain Newton-min algorithm for linear complementarity problems with a \(P\)-matrix
- Sample size selection in optimization methods for machine learning
- Elliptic optimal control problems with \(L^1\)-control cost and applications for the placement of control devices
- A short proof of finiteness of Murty's principal pivoting algorithm
- Error bounds and convergence analysis of feasible descent methods: A general approach
- A block principal pivoting algorithm for large-scale strictly monotone linear complementarity problems
- Introductory lectures on convex optimization. A basic course.
- A nonsmooth version of Newton's method
- A Feasible Active Set Method for Strictly Convex Quadratic Problems with Simple Bounds
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- Newton's method for linear complementarity problems
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy
- A semismooth Newton method for Tikhonov functionals with sparsity constraints
- Semismooth and Semiconvex Functions in Constrained Optimization
- Primal-Dual Strategy for Constrained Optimal Control Problems
- Numerical Optimization
- CUTE
- The Primal-Dual Active Set Strategy as a Semismooth Newton Method
- Sparse Reconstruction by Separable Approximation
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- De-noising by soft-thresholding
- A Semismooth Newton Method with Multidimensional Filter Globalization for $l_1$-Optimization
- Global convergence of damped semismooth Newton methods for ℓ 1 Tikhonov regularization
- Fast Image Recovery Using Variable Splitting and Constrained Optimization
- CUTEr and SifDec
- A SQP-Semismooth Newton-type Algorithm applied to Control of the instationary Navier--Stokes System Subject to Control Constraints
- An Infeasible Primal-Dual Algorithm for Total Bounded Variation--Based Inf-Convolution-Type Image Restoration
- Benchmarking optimization software with performance profiles.