An inexact successive quadratic approximation method for L-1 regularized optimization

Publication: 301652

DOI: 10.1007/s10107-015-0941-y
zbMATH: Zbl 1342.49037
arXiv: 1309.3529
OpenAlex: W2162870776
MaRDI QID: Q301652

Figen Oztoprak, Richard H. Byrd, Jorge Nocedal

Publication date: 1 July 2016

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1309.3529
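For orientation, the sketch below illustrates the generic inexact successive quadratic approximation (proximal Newton-type) idea the title refers to: to minimize \(f(x) + \lambda \|x\|_1\), each outer iteration builds a quadratic model of \(f\) plus the \(\ell_1\) term and minimizes it only approximately, here with a few proximal-gradient (ISTA) steps. This is a minimal illustration under stated assumptions, not the paper's specific algorithm; the inner solver, step acceptance, and parameter choices are placeholders.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def inexact_sqa(grad_f, hess_f, lam, x0, outer_iters=100, inner_iters=25, tol=1e-8):
    # Minimize f(x) + lam * ||x||_1.  At iterate x_k, form the model
    #   q_k(d) = grad_f(x_k)^T d + 0.5 d^T H_k d + lam * ||x_k + d||_1
    # and minimize it only approximately with a few ISTA steps
    # (the "inexact" subproblem solve), then take the resulting step.
    x = x0.astype(float).copy()
    for _ in range(outer_iters):
        g = grad_f(x)
        H = hess_f(x)
        L = np.linalg.norm(H, 2) + 1e-12   # step size 1/L for the quadratic model
        d = np.zeros_like(x)
        for _ in range(inner_iters):       # inexact inner solve
            grad_q = g + H @ d             # gradient of the smooth model part
            d = soft_threshold(x + d - grad_q / L, lam / L) - x
        if np.linalg.norm(d) <= tol * max(1.0, np.linalg.norm(x)):
            break
        x = x + d                          # unit step; practical codes add a line search
    return x

# Example: a small lasso instance with f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
x_hat = inexact_sqa(lambda x: A.T @ (A @ x - b),
                    lambda x: A.T @ A,
                    lam=0.5, x0=np.zeros(10))
```

Truncating the inner loop after a fixed number of steps is only one way to realize the inexactness; adaptive accuracy tests on the subproblem are the more common choice in the literature.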



Related Items

An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
Second order semi-smooth proximal Newton methods in Hilbert spaces
Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
A family of second-order methods for convex \(\ell_1\)-regularized optimization
A flexible coordinate descent method
Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
Practical inexact proximal quasi-Newton method with global complexity analysis
An active set Newton-CG method for \(\ell_1\) optimization
An Iterative Reduction FISTA Algorithm for Large-Scale LASSO
A Reduced-Space Algorithm for Minimizing \(\ell_1\)-Regularized Convex Functions
Inexact successive quadratic approximation for regularized optimization
An inexact quasi-Newton algorithm for large-scale \(\ell_1\) optimization with box constraints
Concave Likelihood-Based Regression with Finite-Support Response Variables
Inexact proximal DC Newton-type method for nonconvex composite functions
A globally convergent proximal Newton-type method in nonsmooth convex optimization
Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
Inexact proximal Newton methods in Hilbert spaces
Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
Local convergence analysis of an inexact trust-region method for nonsmooth optimization
A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
Inexact proximal stochastic second-order methods for nonconvex composite optimization
Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
Sub-sampled Newton methods
A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
FaRSA for \(\ell_1\)-regularized convex optimization: local convergence and numerical experience
Optimization Methods for Large-Scale Machine Learning
Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
Inexact proximal Newton methods for self-concordant functions
Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
Globalized inexact proximal Newton-type methods for nonconvex composite functions
Stochastic proximal quasi-Newton methods for non-convex composite optimization
An Efficient Proximal Block Coordinate Homotopy Method for Large-Scale Sparse Least Squares Problems
An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
Fused Multiple Graphical Lasso
An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising
An active-set proximal quasi-Newton algorithm for \(\ell_1\)-regularized minimization over a sphere constraint


Uses Software


Cites Work