Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
DOI: 10.1007/s10589-017-9964-z · zbMath: 1397.90301 · arXiv: 1607.03081 · OpenAlex: W2963573197 · MaRDI QID: Q1744900
Katya Scheinberg, Hiva Ghanbari
Publication date: 20 April 2018
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1607.03081
Keywords: convergence rates; convex composite optimization; strong convexity; accelerated scheme; proximal quasi-Newton methods; randomized coordinate descent
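The record carries no abstract, but for orientation the following is a minimal, illustrative sketch of a generic proximal quasi-Newton iteration for the composite problem min_x f(x) + g(x) with g(x) = λ‖x‖₁, the class of problems named in the title and keywords. It is not the specific method analyzed in the paper: it uses a scaled-identity (Barzilai-Borwein) Hessian approximation in place of a limited-memory metric and omits the safeguards a practical method needs; the function names and parameter choices are hypothetical.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_quasi_newton_l1(grad_f, x0, lam, n_iter=200):
    """Illustrative proximal quasi-Newton loop for min f(x) + lam*||x||_1.

    Uses a scaled-identity metric H_k = h_k * I (Barzilai-Borwein scalar),
    so each proximal subproblem solves in closed form by soft-thresholding.
    """
    x = x0.astype(float).copy()
    g = grad_f(x)
    h = 1.0  # initial curvature estimate (illustrative choice)
    for _ in range(n_iter):
        # Proximal step in the metric h * I
        x_new = soft_threshold(x - g / h, lam / h)
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        if s @ s > 0.0:
            # BB curvature update; clamp keeps the metric positive definite
            h = max((s @ y) / (s @ s), 1e-8)
        x, g = x_new, g_new
    return x

# Usage: a small lasso instance with f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A, b = rng.standard_normal((40, 100)), rng.standard_normal(40)
x_hat = prox_quasi_newton_l1(lambda x: A.T @ (A @ x - b), np.zeros(100), lam=0.1)
```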
Related Items
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Inexact successive quadratic approximation for regularized optimization
- Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions
- An acceleration of proximal diagonal Newton method
- Minimizing oracle-structured composite functions
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
- Scaled, Inexact, and Adaptive Generalized FISTA for Strongly Convex Optimization
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Gradient methods for minimizing composite functions
- Fast first-order methods for composite convex optimization with backtracking
- Representations of quasi-Newton matrices and their use in limited memory methods
- Introductory lectures on convex optimization. A basic course.
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Accelerated and Inexact Forward-Backward Algorithms
- Proximal Newton-Type Methods for Minimizing Composite Functions
- An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods