Global rates of convergence for nonconvex optimization on manifolds
From MaRDI portal
Publication: Q5113326
DOI: 10.1093/IMANUM/DRX080
zbMATH Open: 1483.65092
arXiv: 1605.08101
OpenAlex: W2402588523
MaRDI QID: Q5113326
Nicolas Boumal, P.-A. Absil, Coralia Cartis
Publication date: 4 June 2020
Published in: IMA Journal of Numerical Analysis
Abstract: We consider the minimization of a cost function $f$ on a manifold $\mathcal{M}$ using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance $\varepsilon$. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of $f$ to the tangent spaces of $\mathcal{M}$, both of these algorithms produce points with Riemannian gradient smaller than $\varepsilon$ in $O(1/\varepsilon^2)$ iterations. Furthermore, RTR returns a point where also the Riemannian Hessian's least eigenvalue is larger than $-\varepsilon$ in $O(1/\varepsilon^3)$ iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy $\varepsilon$ (up to constants) and hence are sharp in that sense. These are the first deterministic results for global rates of convergence to approximate first- and second-order Karush-Kuhn-Tucker points on manifolds. They apply in particular for optimization constrained to compact submanifolds of $\mathbb{R}^n$, under simpler assumptions.
Full work available at URL: https://arxiv.org/abs/1605.08101
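The abstract describes Riemannian gradient descent: take a step along the negative Riemannian gradient in the tangent space, retract back onto the manifold, and stop once the Riemannian gradient norm falls below the tolerance $\varepsilon$. The following is a minimal sketch on the unit sphere, minimizing a Rayleigh quotient; the cost function, fixed step size, and tolerance are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of Riemannian gradient descent on the unit sphere S^{n-1},
# illustrating the first-order scheme analyzed in the paper. The cost
# f(x) = x^T A x, the fixed step size, and the tolerance are illustrative
# assumptions, not the paper's own setup.
import numpy as np

def riemannian_gradient_descent(A, x0, step=0.1, eps=1e-6, max_iter=10_000):
    """Minimize f(x) = x^T A x over the unit sphere."""
    x = x0 / np.linalg.norm(x0)          # start on the manifold
    for k in range(max_iter):
        egrad = 2 * A @ x                # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
        if np.linalg.norm(rgrad) <= eps: # approximate first-order point found
            return x, k
        x = x - step * rgrad             # step in the tangent direction...
        x = x / np.linalg.norm(x)        # ...then retract (renormalize)
    return x, max_iter

# Example: approximate the smallest eigenvector of a symmetric matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2
x, iters = riemannian_gradient_descent(A, rng.standard_normal(5))
print(iters, x @ A @ x, np.linalg.eigvalsh(A)[0])
```

Here the projection of the Euclidean gradient onto the tangent space gives the Riemannian gradient, and renormalization serves as the retraction; the paper's guarantee says such a point with gradient norm at most $\varepsilon$ is reached in $O(1/\varepsilon^2)$ iterations, regardless of initialization.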
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Optimality conditions and duality in mathematical programming (90C46)
- Nonconvex programming, global optimization (90C26)
Cited In (63)
- Parameter-free accelerated gradient descent for nonconvex minimization
- Riemannian preconditioned coordinate descent for low multilinear rank approximation
- A Riemannian dimension-reduced second-order method with application in sensor network localization
- A feasible method for general convex low-rank SDP problems
- Computing second-order points under equality constraints: revisiting Fletcher's augmented Lagrangian
- Smoothing algorithms for nonsmooth optimization over the Stiefel manifold with applications to the graph Fourier basis problem
- New vector transport operators extending a Riemannian CG algorithm to generalized Stiefel manifold with low-rank applications
- Convergence and worst-case complexity of adaptive Riemannian trust-region methods for optimization on manifolds
- Trace Lasso regularization for adaptive sparse canonical correlation analysis via manifold optimization approach
- Sparse additive function decompositions facing basis transforms
- A Riemannian Proximal Newton Method
- Retraction-based direct search methods for derivative free Riemannian optimization
- Nonsmooth optimization over the Stiefel manifold and beyond: proximal gradient method and recent variants
- An inexact Riemannian proximal gradient method
- An accelerated first-order method for non-convex optimization on manifolds
- Riemannian Optimization on the Symplectic Stiefel Manifold
- Nonlinear matrix recovery using optimization on the Grassmann manifold
- Riemannian proximal gradient methods
- Mini-workshop: Computational optimization on manifolds. Abstracts from the mini-workshop held November 15–21, 2020 (online meeting)
- Rank Optimality for the Burer--Monteiro Factorization
- Multivariate expectile-based distribution: properties, Bayesian inference, and applications
- Gradient Method for Optimization on Riemannian Manifolds with Lower Bounded Curvature
- On the optimization landscape of tensor decompositions
- Proximal quasi-Newton method for composite optimization over the Stiefel manifold
- New Riemannian Preconditioned Algorithms for Tensor Completion via Polyadic Decomposition
- First Order Methods for Optimization on Riemannian Manifolds
- An exact penalty approach for optimization with nonnegative orthogonality constraints
- An active-set proximal quasi-Newton algorithm for ℓ1-regularized minimization over a sphere constraint
- Simple algorithms for optimization on Riemannian manifolds with constraints
- Primal-dual optimization algorithms over Riemannian manifolds: an iteration complexity analysis
- Non-convex exact community recovery in stochastic block model
- Proximal Gradient Method for Nonsmooth Optimization over the Stiefel Manifold
- Weakly Convex Optimization over Stiefel Manifold Using Riemannian Subgradient-Type Methods
- A brief introduction to manifold optimization
- Riemannian gradient descent methods for graph-regularized matrix completion
- Riemannian smoothing gradient type algorithms for nonsmooth optimization problem on compact Riemannian submanifold embedded in Euclidean space
- An adaptive Riemannian gradient method without function evaluations
- Adaptive regularization with cubics on manifolds
- Approximate Matrix and Tensor Diagonalization by Unitary Transformations: Convergence of Jacobi-Type Algorithms
- Structured Quasi-Newton Methods for Optimization with Orthogonality Constraints
- Memory-Efficient Structured Convex Optimization via Extreme Point Sampling
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- A Riemannian optimization approach to the radial distribution network load flow problem
- Sequential Quadratic Optimization for Nonlinear Optimization Problems on Riemannian Manifolds
- Nonconvex phase synchronization
- Title not available
- Globally convergent optimization algorithms on Riemannian manifolds: Uniform framework for unconstrained and constrained optimization
- A Decomposition Augmented Lagrangian Method for Low-Rank Semidefinite Programming
- Geometric Methods on Low-Rank Matrix and Tensor Manifolds
- Riemannian Hamiltonian Methods for Min-Max Optimization on Manifolds
- Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
- A Riemannian rank-adaptive method for low-rank matrix completion
- Quotient Geometry with Simple Geodesics for the Manifold of Fixed-Rank Positive-Semidefinite Matrices
- An alternative to EM for Gaussian mixture models: batch and stochastic Riemannian optimization
- Cholesky QR-based retraction on the generalized Stiefel manifold
- Riemannian optimization with a preconditioning scheme on the generalized Stiefel manifold
- Finding stationary points on bounded-rank matrices: a geometric hurdle and a smooth remedy
- Recent Advances in Stochastic Riemannian Optimization
- On the quality of first-order approximation of functions with Hölder continuous gradient
- Tracking and Regret Bounds for Online Zeroth-Order Euclidean and Riemannian Optimization
- Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery
- On the Simplicity and Conditioning of Low Rank Semidefinite Programs
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization
Recommendations
- Erratum to: "Global rates of convergence for nonconvex optimization on manifolds"
- Globally convergent optimization algorithms on Riemannian manifolds: Uniform framework for unconstrained and constrained optimization
- Rate of convergence for proximal point algorithms on Hadamard manifolds
- Convergence rates of gradient methods for convex optimization in the space of measures
- Rates of Convergence for a Class of Global Stochastic Optimization Algorithms
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- Global convergence of ADMM in nonconvex nonsmooth optimization
- Global convergence of a curvilinear search for non-convex optimization
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
- Global optimality conditions for nonconvex optimization
This page was built for publication: Global rates of convergence for nonconvex optimization on manifolds