Pages that link to "Item:Q4571877"
The following pages link to Accelerated Methods for NonConvex Optimization (Q4571877):
Displaying 50 items.
- Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points (Q683332)
- The global optimization geometry of shallow linear neural networks (Q1988338)
- Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems (Q2022322)
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems (Q2044484)
- A FISTA-type accelerated gradient algorithm for solving smooth nonconvex composite optimization problems (Q2044494)
- An adaptive high order method for finding third-order critical points of nonconvex optimization (Q2079692)
- Accelerated inexact composite gradient methods for nonconvex spectral optimization problems (Q2149955)
- A regularization interpretation of the proximal point method for weakly convex functions (Q2179443)
- An efficient adaptive accelerated inexact proximal point method for solving linearly constrained nonconvex composite problems (Q2181594)
- Lower bounds for finding stationary points I (Q2205972)
- Optimization for deep learning: an overview (Q2218095)
- Lower bounds for finding stationary points II: first-order methods (Q2220663)
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds (Q2288191)
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization (Q2297654)
- Combining stochastic adaptive cubic regularization with negative curvature for nonconvex optimization (Q2302838)
- Provable accelerated gradient method for nonconvex low rank optimization (Q2303662)
- Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary (Q2330649)
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation (Q2676160)
- Inexact accelerated high-order proximal-point methods (Q2689812)
- Zeroth-order nonconvex stochastic optimization: handling constraints, high dimensionality, and saddle points (Q2696568)
- Adaptive Quadratically Regularized Newton Method for Riemannian Optimization (Q3176355)
- Accelerating Proximal Markov Chain Monte Carlo by Using an Explicit Stabilized Method (Q3296473)
- (Q4558559)
- Accelerated Methods for NonConvex Optimization (Q4571877)
- A Newton-Based Method for Nonconvex Optimization with Fast Evasion of Saddle Points (Q4620423)
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy (Q4629334)
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization (Q4641667)
- Lower Bounds for Parallel and Randomized Convex Optimization (Q4969036)
- (Q4969117)
- (Q4969167)
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure (Q5013580)
- Extending the Step-Size Restriction for Gradient Descent to Avoid Strict Saddle Points (Q5027014)
- Global Convergence of Stochastic Gradient Hamiltonian Monte Carlo for Nonconvex Stochastic Optimization: Nonasymptotic Performance Bounds and Momentum-Based Acceleration (Q5058053)
- Running Primal-Dual Gradient Method for Time-Varying Nonconvex Problems (Q5093264)
- Escaping Strict Saddle Points of the Moreau Envelope in Nonsmooth Optimization (Q5097019)
- First-Order Methods for Nonconvex Quadratic Minimization (Q5113167)
- Second-Order Guarantees of Distributed Gradient Algorithms (Q5131964)
- An Average Curvature Accelerated Composite Gradient Method for Nonconvex Smooth Composite Optimization Problems (Q5147027)
- Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem (Q5148399)
- Convergence of Newton-MR under Inexact Hessian Information (Q5148404)
- (Q5149016)
- An Accelerated Inexact Proximal Point Method for Solving Nonconvex-Concave Min-Max Problems (Q5162651)
- (Q5214226)
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints (Q5217594)
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step (Q5233102)
- Complexity of a Quadratic Penalty Accelerated Inexact Proximal Point Method for Solving Linearly Constrained Nonconvex Composite Programs (Q5237309)
- Accelerated Stochastic Algorithms for Nonconvex Finite-Sum and Multiblock Optimization (Q5242931)
- Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization (Q5853562)
- Stochastic proximal linear method for structured non-convex problems (Q5858986)
- Higher-Order Methods for Convex-Concave Min-Max Optimization and Monotone Variational Inequalities (Q5869812)