Pages that link to "Item:Q2425175"
The following pages link to First-order methods almost always avoid strict saddle points (Q2425175):
Displaying 34 items.
- The global optimization geometry of shallow linear neural networks (Q1988338) (← links)
- Regional complexity analysis of algorithms for nonconvex smooth optimization (Q2020615) (← links)
- Error bound of critical points and KL property of exponent 1/2 for squared F-norm regularized factorization (Q2052408) (← links)
- On the geometric analysis of a quartic-quadratic optimization problem under a spherical constraint (Q2089778) (← links)
- Triangularized orthogonalization-free method for solving extreme eigenvalue problems (Q2103436) (← links)
- On initial point selection of the steepest descent algorithm for general quadratic functions (Q2141353) (← links)
- Proximal methods avoid active strict saddles of weakly convex functions (Q2143222) (← links)
- Landscape analysis for shallow neural networks: complete classification of critical points for affine target functions (Q2156337) (← links)
- A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions (Q2167333) (← links)
- Stochastic optimization with momentum: convergence, fluctuations, and traps avoidance (Q2233558) (← links)
- An envelope for Davis-Yin splitting and strict saddle-point avoidance (Q2420798) (← links)
- Exploiting negative curvature in deterministic and stochastic optimization (Q2425164) (← links)
- Consensus-based optimization on hypersurfaces: Well-posedness and mean-field limit (Q3388781) (← links)
- (Q4969167) (← links)
- (Q4998970) (← links)
- Computing Symplectic Eigenpairs of Symmetric Positive-Definite Matrices via Trace Minimization and Riemannian Optimization (Q5021025) (← links)
- Extending the Step-Size Restriction for Gradient Descent to Avoid Strict Saddle Points (Q5027014) (← links)
- (Q5053253) (← links)
- Fast Cluster Detection in Networks by First Order Optimization (Q5065475) (← links)
- Column $\ell_{2,0}$-Norm Regularized Factorization Model of Low-Rank Matrix Recovery and Its Computation (Q5081098) (← links)
- Escaping Strict Saddle Points of the Moreau Envelope in Nonsmooth Optimization (Q5097019) (← links)
- Second-Order Guarantees of Distributed Gradient Algorithms (Q5131964) (← links)
- Model-free Nonconvex Matrix Completion: Local Minima Analysis and Applications in Memory-efficient Kernel PCA (Q5214234) (← links)
- Coordinatewise Descent Methods for Leading Eigenvalue Problem (Q5230665) (← links)
- First-order Methods for the Impatient: Support Identification in Finite Time with Convergent Frank-Wolfe Variants (Q5233104) (← links)
- (Q5381139) (← links)
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima (Q5853567) (← links)
- Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations (Q6038672) (← links)
- Global convergence of the gradient method for functions definable in o-minimal structures (Q6052062) (← links)
- Convergence of the Momentum Method for Semialgebraic Functions with Locally Lipschitz Gradients (Q6071885) (← links)
- A geometric approach of gradient descent algorithms in linear neural networks (Q6099180) (← links)
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization (Q6158001) (← links)
- Provable Phase Retrieval with Mirror Descent (Q6168333) (← links)
- Weighted Trace-Penalty Minimization for Full Configuration Interaction (Q6189167) (← links)