A family of second-order methods for convex \(\ell_1\)-regularized optimization
Publication: Q312690
Recommendations
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- A reduced-space algorithm for minimizing \(\ell_1\)-regularized convex functions
Cites work
- scientific article; zbMATH DE number 5937962 (no title available)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A SQP-Semismooth Newton-Type Algorithm Applied to Control of the Instationary Navier–Stokes System Subject to Control Constraints
- A block principal pivoting algorithm for large-scale strictly monotone linear complementarity problems
- A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation
- A feasible active set method for strictly convex quadratic problems with simple bounds
- A nonsmooth version of Newton's method
- A semismooth Newton method for Tikhonov functionals with sparsity constraints
- A semismooth Newton method with multidimensional filter globalization for \(l_1\)-optimization
- A short proof of finiteness of Murty's principal pivoting algorithm
- An Infeasible Primal-Dual Algorithm for Total Bounded Variation–Based Inf-Convolution-Type Image Restoration
- An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
- An inexact successive quadratic approximation method for L-1 regularized optimization
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Benchmarking optimization software with performance profiles.
- CUTE
- CUTEr and SifDec
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- De-noising by soft-thresholding
- Dual averaging methods for regularized stochastic learning and online optimization
- Efficient online and batch learning using forward backward splitting
- Elliptic optimal control problems with L^1-control cost and applications for the placement of control devices
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Fast Image Recovery Using Variable Splitting and Constrained Optimization
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Fixed-Point Continuation for \(\ell_1\)-Minimization: Methodology and Convergence
- Global convergence of damped semismooth Newton methods for \(\ell_{1}\) Tikhonov regularization
- Introductory lectures on convex optimization. A basic course.
- Newton's method for linear complementarity problems
- Nonconvergence of the plain Newton-min algorithm for linear complementarity problems with a P-matrix
- Nonlinear optimization.
- Numerical Optimization
- Primal-Dual Strategy for Constrained Optimal Control Problems
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Sample size selection in optimization methods for machine learning
- Semismooth and Semiconvex Functions in Constrained Optimization
- Sparse Reconstruction by Separable Approximation
- The Primal-Dual Active Set Strategy as a Semismooth Newton Method
Cited in (34)
- A block principal pivoting algorithm for vertical generalized LCP with a vertical block P-matrix
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell _1\)-optimization
- Cardinality minimization, constraints, and regularization: a survey
- An efficient proximal block coordinate homotopy method for large-scale sparse least squares problems
- Optimization methods for large-scale machine learning
- A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
- An active-set proximal quasi-Newton algorithm for \(\ell_1\)-regularized minimization over a sphere constraint
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- An extension of the second order dynamical system that models Nesterov's convex gradient method
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- Practical inexact proximal quasi-Newton method with global complexity analysis
- An inexact semismooth Newton method on Riemannian manifolds with application to duality-based total variation denoising
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- An Algorithmic Characterization of P-matricity II: Adjustments, Refinements, and Validation
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- An active set Newton-CG method for \(\ell_1\) optimization
- Superfast second-order methods for unconstrained convex optimization
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- An iterative reduction FISTA algorithm for large-scale LASSO
- A globally convergent primal-dual active-set framework for large-scale convex quadratic optimization
- A feasible active set method for strictly convex quadratic problems with simple bounds
- An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- A decomposition method for Lasso problems with zero-sum constraint
- A fast conjugate gradient algorithm with active set prediction for \(\ell_1\) optimization
- An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
- FaRSA for \(\ell_1\)-regularized convex optimization: local convergence and numerical experience
- A reduced-space algorithm for minimizing \(\ell_1\)-regularized convex functions