Pages that link to "Item:Q1675251"
From MaRDI portal
The following pages link to From error bounds to the complexity of first-order descent methods for convex functions (Q1675251):
Displaying 50 items.
- Perturbation of error bounds (Q2413098) (← links)
- Quadratic growth conditions and uniqueness of optimal solution to Lasso (Q2671439) (← links)
- The equivalence of three types of error bounds for weakly and approximately convex functions (Q2671442) (← links)
- Optimal non-asymptotic analysis of the Ruppert-Polyak averaging stochastic algorithm (Q2680399) (← links)
- Convergence rates of the heavy-ball method under the Łojasiewicz property (Q2687044) (← links)
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry (Q2687067) (← links)
- Revisiting the approximate Carathéodory problem via the Frank-Wolfe algorithm (Q2689818) (← links)
- Variance reduction for root-finding problems (Q2689823) (← links)
- Optimal convergence rates for damped inertial gradient dynamics with flat geometries (Q2694484) (← links)
- A simple nearly optimal restart scheme for speeding up first-order methods (Q2696573) (← links)
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds (Q2696991) (← links)
- Local Minimizers of Semi-Algebraic Functions from the Viewpoint of Tangencies (Q3300766) (← links)
- Convergence Rates of Damped Inertial Dynamics under Geometric Conditions and Perturbations (Q3300770) (← links)
- The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient (Q3304387) (← links)
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity (Q4558142) (← links)
- Random Function Iterations for Consistent Stochastic Feasibility (Q4632357) (← links)
- On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition (Q4634142) (← links)
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent (Q4969070) (← links)
- Computational approaches to non-convex, sparsity-inducing multi-penalty regularization (Q4989924) (← links)
- Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions (Q4991666) (← links)
- Convergence Rate Analysis of a Sequential Convex Programming Method with Line Search for a Class of Constrained Difference-of-Convex Optimization Problems (Q5010048) (← links)
- Screening Rules and its Complexity for Active Set Identification (Q5026418) (← links)
- First-Order Algorithms for a Class of Fractional Optimization Problems (Q5026841) (← links)
- Proximal Gradient Methods for Machine Learning and Imaging (Q5028165) (← links)
- Tensor Canonical Correlation Analysis With Convergence and Statistical Guarantees (Q5066457) (← links)
- Constant step stochastic approximations involving differential inclusions: stability, long-run convergence and applications (Q5086426) (← links)
- Multiple-sets split quasi-convex feasibility problems: Adaptive subgradient methods with convergence guarantee (Q5088832) (← links)
- Thresholding gradient methods in Hilbert spaces: support identification and linear convergence (Q5109200) (← links)
- Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization (Q5116551) (← links)
- Active Set Complexity of the Away-Step Frank--Wolfe Algorithm (Q5124005) (← links)
- Optimal Convergence Rates for Nesterov Acceleration (Q5206941) (← links)
- Sharpness, Restart, and Acceleration (Q5210521) (← links)
- Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity (Q5231668) (← links)
- Novel Reformulations and Efficient Algorithms for the Generalized Trust Region Subproblem (Q5231678) (← links)
- Accelerate stochastic subgradient method by leveraging local growth condition (Q5236746) (← links)
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima (Q5853567) (← links)
- Deep Neural Networks for Inverse Problems with Pseudodifferential Operators: An Application to Limited-Angle Tomography (Q5860291) (← links)
- The Exact Modulus of the Generalized Concave Kurdyka-Łojasiewicz Property (Q5870355) (← links)
- Hölderian Error Bounds and Kurdyka-Łojasiewicz Inequality for the Trust Region Subproblem (Q5870365) (← links)
- Newton acceleration on manifolds identified by proximal gradient methods (Q6044974) (← links)
- Error bounds, facial residual functions and applications to the exponential cone (Q6044980) (← links)
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions (Q6058513) (← links)
- Nonlocal error bounds for piecewise affine functions (Q6080421) (← links)
- A speed restart scheme for a dynamics with Hessian-driven damping (Q6086152) (← links)
- FISTA is an automatic geometrically optimized algorithm for strongly convex functions (Q6120847) (← links)
- Radial duality. II: Applications and algorithms (Q6126645) (← links)
- Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems (Q6136656) (← links)
- On the relationship between the Kurdyka-Łojasiewicz property and error bounds on Hadamard manifolds (Q6151598) (← links)
- Linear Convergence of a Proximal Alternating Minimization Method with Extrapolation for \(\boldsymbol{\ell_1}\)-Norm Principal Component Analysis (Q6158000) (← links)
- Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification (Q6165598) (← links)