Analysis of asymptotic escape of strict saddle sets in manifold optimization
From MaRDI portal
Publication:5037575
Abstract: In this paper, we analyze the asymptotic escape of strict saddles in manifold optimization by the projected gradient descent (PGD) algorithm. One of our main contributions is to extend the existing analysis to non-isolated, possibly continuous saddle sets with complicated geometry. We prove that PGD escapes strict critical submanifolds under certain conditions on the geometry and the distribution of the saddle point sets. We also show that PGD may fail to escape strict saddles under weaker assumptions, even when the saddle point set has zero measure and there is a uniform escape direction; we provide a counterexample to illustrate this point. We apply this saddle analysis to the phase retrieval problem on the low-rank matrix manifold, proving that it has only finitely many saddles and that, with high probability, they are strict. We also indicate how our analysis applies to a broader range of manifold optimization problems.
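The PGD scheme discussed in the abstract (a Euclidean gradient step followed by a metric projection back onto the manifold) can be illustrated by a minimal sketch. For concreteness the sketch uses the unit sphere rather than the paper's low-rank matrix manifold, and the function names (`pgd_sphere`, the quadratic objective) are hypothetical illustrations, not from the paper:

```python
import numpy as np

def pgd_sphere(grad_f, x0, step=0.1, iters=500):
    """Illustrative projected gradient descent on the unit sphere S^{n-1}:
    take a Euclidean gradient step, then project back onto the manifold."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = grad_f(x)
        # Riemannian gradient: project the Euclidean gradient onto the tangent space at x
        g_tan = g - (x @ g) * x
        y = x - step * g_tan
        x = y / np.linalg.norm(y)  # metric projection back onto the sphere
    return x

# Toy example: minimize f(x) = x^T A x on the sphere. The eigenvector for the
# smallest eigenvalue of A is the minimizer; the other eigenvectors are strict
# saddles, which PGD escapes from a generic initialization.
A = np.diag([3.0, 2.0, 1.0])
x = pgd_sphere(lambda x: 2 * A @ x, np.array([0.3, 0.4, 0.86]))
```

In this toy setting the iterates converge to the eigenvector for the smallest eigenvalue of A; the saddle phenomena analyzed in the paper concern how such iterations behave near non-isolated critical submanifolds.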
Recommendations
- Escaping strict saddle points of the Moreau envelope in nonsmooth optimization
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
- Behavior of accelerated gradient methods near critical points of nonconvex functions
- Extending the Step-Size Restriction for Gradient Descent to Avoid Strict Saddle Points
- A Newton-based method for nonconvex optimization with fast evasion of saddle points
Cites work
- scientific article; zbMATH DE number 3980052
- scientific article; zbMATH DE number 2200453
- scientific article; zbMATH DE number 3238721
- 3D Point Cloud Denoising Using Graph Laplacian Regularization of a Low Dimensional Manifold Model
- A fast hierarchically preconditioned eigensolver based on multiresolution matrix decomposition
- A geometric analysis of phase retrieval
- An Extrinsic Look at the Riemannian Hessian
- Blind deconvolution by a steepest descent algorithm on a quotient manifold
- Bose-Einstein condensation and superfluidity
- Complete Dictionary Recovery Over the Sphere I: Overview and the Geometric Picture
- Compressed modes for variational problems in mathematics and physics
- Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality
- Critical points of matrix least squares distance functions
- Data-driven tight frame construction and image denoising
- Diffuse Interface Models on Graphs for Classification of High Dimensional Data
- Dynamical Low‐Rank Approximation
- Efficient rank reduction of correlation matrices
- Empirical Arithmetic Averaging Over the Compact Stiefel Manifold
- Exact guarantees on the absence of spurious local minima for non-negative rank-1 robust principal component analysis
- Feature selection and multi-kernel learning for sparse representation on a manifold
- Gradient descent only converges to minimizers: non-isolated critical points and invariant regions
- Guarantees of Riemannian optimization for low rank matrix recovery
- Information-theoretic differential geometry of quantum phase transitions
- Low dimensional manifold model for image processing
- Low-rank matrix completion by Riemannian optimization
- Low-rank tensor completion by Riemannian optimization
- Morse-Bott homology
- Natural gradient via optimal transport
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- On manifolds of tensors of fixed TT-rank
- On the Number of Solutions to Polynomial Systems of Equations
- Online learning in the embedded manifold of low-rank matrices
- Optimization Techniques on Riemannian Manifolds
- Optimization on the hierarchical Tucker manifold - applications to tensor completion
- Riemannian structure on manifolds of quantum states
- Sobolev gradient flow for the Gross-Pitaevskii eigenvalue problem: global convergence and computational efficiency
- Solving polynomial systems
- The Geometry of Algorithms with Orthogonality Constraints
- Time integration of tensor trains
- Toward the Optimal Construction of a Loss Function Without Spurious Local Minima for Solving Quadratic Equations
- Welcome to Riemannian computing in computer vision