Error bound conditions and convergence of optimization methods on smooth and proximally smooth manifolds
DOI: 10.1080/02331934.2020.1812066
zbMath: 1490.90224
arXiv: 1912.04660
OpenAlex: W3083877392
MaRDI QID: Q5070619
A. A. Tremba, Maxim V. Balashov
Publication date: 13 April 2022
Published in: Optimization
Full work available at URL: https://arxiv.org/abs/1912.04660
Keywords: Newton's method; nonconvex optimization; gradient projection algorithm; proximal smoothness; error bound condition
MSC: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Numerical optimization and variational techniques (65K10); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10)
Cites Work
- Lectures on convex optimization
- An extension of the gradient projection method and Newton's method to extremum problems constrained by a smooth surface
- Stepsize analysis for descent methods
- New error bounds and their applications to convergence analysis of iterative algorithms
- Minimizing a Quadratic Over a Sphere
- Evaluation Complexity for Nonlinear Constrained Optimization Using Unscaled KKT Conditions and High-Order Models
- On the Evaluation Complexity of Cubic Regularization Methods for Potentially Rank-Deficient Nonlinear Least-Squares Problems and Its Relevance to Constrained Nonlinear Optimization
- Projection-like Retractions on Matrix Manifolds
- Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality
- Strong and Weak Convexity of Sets and Functions
- A Globally Convergent Augmented Lagrangian Algorithm for Optimization with General Constraints and Simple Bounds
- The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- The Geometry of Algorithms with Orthogonality Constraints
- Local differentiability of distance functions
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Gradient Projection and Conditional Gradient Methods for Constrained Nonconvex Minimization
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- The Gradient Projection Method Along Geodesics
- New versions of Newton method: step-size choice, convergence domain and under-determined equations
- On various notions of regularity of sets in nonsmooth analysis