A family of second-order methods for convex \(\ell_1\)-regularized optimization
DOI: 10.1007/s10107-015-0965-3
zbMATH Open: 1350.49046
OpenAlex: W2287259016
MaRDI QID: Q312690
FDO: Q312690
Authors: Gillian M. Chin, Figen Oztoprak, R. H. Byrd, Jorge Nocedal
Publication date: 16 September 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-015-0965-3
Recommendations
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- A reduced-space algorithm for minimizing \(\ell_1\)-regularized convex functions
Keywords: convex optimization; second-order methods; \(\ell _1\)-regularization; block active set method; orthant based method; quadratic problems; semismooth Newton method
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Newton-type methods (49M15)
Cites Work
- CUTEr and SifDec
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation
- CUTE
- Numerical Optimization
- Benchmarking optimization software with performance profiles
- Introductory lectures on convex optimization. A basic course
- Sample size selection in optimization methods for machine learning
- Fast Image Recovery Using Variable Splitting and Constrained Optimization
- De-noising by soft-thresholding
- Sparse Reconstruction by Separable Approximation
- A nonsmooth version of Newton's method
- Nonlinear optimization
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- Semismooth and Semiconvex Functions in Constrained Optimization
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- An Infeasible Primal-Dual Algorithm for Total Bounded Variation--Based Inf-Convolution-Type Image Restoration
- Dual averaging methods for regularized stochastic learning and online optimization
- Primal-Dual Strategy for Constrained Optimal Control Problems
- The Primal-Dual Active Set Strategy as a Semismooth Newton Method
- A SQP-Semismooth Newton-type Algorithm applied to Control of the instationary Navier--Stokes System Subject to Control Constraints
- Efficient online and batch learning using forward backward splitting
- Error bounds and convergence analysis of feasible descent methods: A general approach
- An inexact successive quadratic approximation method for L-1 regularized optimization
- A semismooth Newton method with multidimensional filter globalization for \(l_1\)-optimization
- A short proof of finiteness of Murty's principal pivoting algorithm
- A block principal pivoting algorithm for large-scale strictly monotone linear complementarity problems
- A feasible active set method for strictly convex quadratic problems with simple bounds
- Newton's method for linear complementarity problems
- An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
- A semismooth Newton method for Tikhonov functionals with sparsity constraints
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Global convergence of damped semismooth Newton methods for \(\ell_{1}\) Tikhonov regularization
- Nonconvergence of the plain Newton-min algorithm for linear complementarity problems with a \(P\)-matrix
- Elliptic optimal control problems with \(L^1\)-control cost and applications for the placement of control devices
Cited In (34)
- A block principal pivoting algorithm for vertical generalized LCP with a vertical block P-matrix
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- Cardinality minimization, constraints, and regularization: a survey
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- An efficient proximal block coordinate homotopy method for large-scale sparse least squares problems
- Optimization methods for large-scale machine learning
- A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
- An active-set proximal quasi-Newton algorithm for \(\ell_1\)-regularized minimization over a sphere constraint
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- An extension of the second order dynamical system that models Nesterov's convex gradient method
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- An inexact semismooth Newton method on Riemannian manifolds with application to duality-based total variation denoising
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- An Algorithmic Characterization of P-matricity II: Adjustments, Refinements, and Validation
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- An active set Newton-CG method for \(\ell_1\) optimization
- Superfast second-order methods for unconstrained convex optimization
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- An iterative reduction FISTA algorithm for large-scale LASSO
- A globally convergent primal-dual active-set framework for large-scale convex quadratic optimization
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- A feasible active set method for strictly convex quadratic problems with simple bounds
- An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
- A decomposition method for Lasso problems with zero-sum constraint
- A fast conjugate gradient algorithm with active set prediction for \(\ell_1\) optimization
- An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
- A reduced-space algorithm for minimizing \(\ell_1\)-regularized convex functions
- FaRSA for \(\ell_1\)-regularized convex optimization: local convergence and numerical experience
Uses Software
This page was built for publication: A family of second-order methods for convex \(\ell _1\)-regularized optimization