Pages that link to "Item:Q5219676"
From MaRDI portal
The following pages link to Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods (Q5219676):
Displaying 50 items.
- The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth (Q523179)
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates (Q1744900)
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions (Q2020604)
- On the linear convergence of forward-backward splitting method. I: Convergence analysis (Q2031953)
- Growth conditions on a function and the error bound condition (Q2037710)
- The gradient projection algorithm for smooth sets and functions in nonconvex case (Q2045187)
- Level-set subdifferential error bounds and linear convergence of Bregman proximal gradient method (Q2046546)
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence (Q2067681)
- Augmented Lagrangian method for second-order cone programs under second-order sufficiency (Q2070362)
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems (Q2070400)
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees (Q2084299)
- Stochastic variance-reduced prox-linear algorithms for nonconvex composite optimization (Q2089785)
- Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak-Łojasiewicz condition (Q2089864)
- Sufficient conditions for a minimum of a strongly quasiconvex function on a weakly convex set (Q2113393)
- Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity (Q2115253)
- Primal superlinear convergence of SQP methods in piecewise linear-quadratic composite optimization (Q2116019)
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis (Q2116020)
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition (Q2128612)
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis (Q2133415)
- Proximal methods avoid active strict saddles of weakly convex functions (Q2143222)
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization (Q2149551)
- Scaled relative graphs: nonexpansive operators via 2D Euclidean geometry (Q2149561)
- Global convergence of model function based Bregman proximal minimization algorithms (Q2154449)
- Stability of minimization problems and the error bound condition (Q2158835)
- Quadratic growth and strong metric subregularity of the subdifferential for a class of non-prox-regular functions (Q2159463)
- Kurdyka-Łojasiewicz exponent via inf-projection (Q2162122)
- The multiproximal linearization method for convex composite problems (Q2191762)
- On the gradient projection method for weakly convex functions on a proximally smooth set (Q2217258)
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria (Q2220664)
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions (Q2297652)
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate (Q2301128)
- Convergence rates of forward-Douglas-Rachford splitting method (Q2317846)
- Efficiency of minimizing compositions of convex functions and smooth maps (Q2330660)
- Linear convergence of first order methods for non-strongly convex optimization (Q2414900)
- Inexact successive quadratic approximation for regularized optimization (Q2419525)
- Non-smooth non-convex Bregman minimization: unification and new algorithms (Q2420780)
- Quadratic growth conditions and uniqueness of optimal solution to Lasso (Q2671439)
- The equivalence of three types of error bounds for weakly and approximately convex functions (Q2671442)
- A globally convergent proximal Newton-type method in nonsmooth convex optimization (Q2687066)
- Linearized proximal algorithms with adaptive stepsizes for convex composite optimization with applications (Q2694483)
- Error bound and isocost imply linear convergence of DCA-based algorithms to D-stationarity (Q2697002)
- The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient (Q3304387)
- Characterization of solutions of strong-weak convex programming problems (Q3382765)
- Strong Metric (Sub)regularity of Karush–Kuhn–Tucker Mappings for Piecewise Linear-Quadratic Convex-Composite Optimization and the Quadratic Convergence of Newton’s Method (Q3387919)
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity (Q4558142)
- Stochastic Methods for Composite and Weakly Convex Optimization Problems (Q4561227)
- Stochastic Model-Based Minimization of Weakly Convex Functions (Q4620418)
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods (Q4641660)
- A linearly convergent majorized ADMM with indefinite proximal terms for convex composite programming and its applications (Q4960078)
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent (Q4969070)