Practical inexact proximal quasi-Newton method with global complexity analysis
DOI: 10.1007/s10107-016-0997-3
zbMATH: 1366.90166
arXiv: 1311.6547
OpenAlex: W2962748029
MaRDI QID: Q344963
Katya Scheinberg, Xiaocheng Tang
Publication date: 25 November 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1311.6547
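
The paper develops an inexact proximal quasi-Newton method for composite problems \(\min_x F(x) = f(x) + g(x)\), with \(f\) smooth and convex and \(g\) convex but possibly nonsmooth (e.g. \(g(x) = \lambda\|x\|_1\)); each iteration approximately minimizes a quadratic model of \(f\) plus \(g\), and the analysis bounds the global iteration complexity. For orientation only, the sketch below follows the same outer pattern but uses a scalar Barzilai-Borwein metric \(B_k = (1/t_k)I\), safeguarded by backtracking, so the subproblem has a closed-form soft-thresholding solution. The names `prox_qn_l1` and `soft_threshold` are invented for this illustration; the authors' method instead builds limited-memory quasi-Newton models whose subproblems are solved inexactly.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_qn_l1(f, grad_f, x0, lam, max_iter=500, tol=1e-8):
    # Scalar-metric stand-in for a proximal quasi-Newton step on
    #   min_x f(x) + lam * ||x||_1.
    # B_k = (1/t) I comes from a Barzilai-Borwein (secant) estimate and is
    # safeguarded by halving t until the quadratic model majorizes f at the
    # trial point, so the subproblem is solved exactly in closed form.
    # Illustrative sketch only, not the paper's L-BFGS-based algorithm.
    x, g, t = x0.copy(), grad_f(x0), 1.0
    for _ in range(max_iter):
        while True:  # backtracking on the sufficient-decrease condition
            x_new = soft_threshold(x - t * g, t * lam)
            d = x_new - x
            if f(x_new) <= f(x) + g @ d + (d @ d) / (2.0 * t) + 1e-12:
                break
            t *= 0.5
        if np.linalg.norm(d) <= tol:
            return x_new
        g_new = grad_f(x_new)
        s, y = d, g_new - g
        sy = s @ y
        t = (s @ s) / sy if sy > 1e-12 else 1.0  # BB metric for the next step
        x, g = x_new, g_new
    return x
```

A toy lasso instance exercises the sketch:

```python
# f(x) = 0.5 * ||A x - b||^2 with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
x_hat = prox_qn_l1(f, lambda x: A.T @ (A @ x - b), np.zeros(100), lam=0.1)
print("recovered support size:", int(np.sum(np.abs(x_hat) > 1e-3)))
```

The scalar-metric variant is closest in spirit to the SpaRSA scheme cited below; replacing \(1/t_k\) with a low-rank quasi-Newton matrix makes the subproblem a lasso-type problem in its own right, which is where the inexact subproblem solves and the paper's complexity analysis come in.
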
Related Items (31)
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- A flexible coordinate descent method
- A Reduced-Space Algorithm for Minimizing $\ell_1$-Regularized Convex Functions
- COAP 2021 best paper prize
- Inexact successive quadratic approximation for regularized optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- Inexact proximal stochastic gradient method for convex composite optimization
- Inexact proximal DC Newton-type method for nonconvex composite functions
- A globally convergent proximal Newton-type method in nonsmooth convex optimization
- Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
- Minimizing oracle-structured composite functions
- Inexact proximal Newton methods in Hilbert spaces
- Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
- A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- FaRSA for ℓ1-regularized convex optimization: local convergence and numerical experience
- Optimization Methods for Large-Scale Machine Learning
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Inexact proximal Newton methods for self-concordant functions
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
- Fused Multiple Graphical Lasso
- One-Step Estimation with Scaled Proximal Methods
Cites Work
- Sparse inverse covariance estimation with the graphical lasso
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- A family of second-order methods for convex \(\ell_1\)-regularized optimization
- An inexact interior point method for \(L_{1}\)-regularized sparse covariance selection
- A coordinate gradient descent method for nonsmooth separable minimization
- Representations of quasi-Newton matrices and their use in limited memory methods
- Introductory lectures on convex optimization. A basic course.
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Cubic regularization of Newton method and its global performance
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- Identifying Activity
- Sparse Reconstruction by Separable Approximation
- De-noising by soft-thresholding
- An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP