Pages that link to "Item:Q2205972"
The following pages link to Lower bounds for finding stationary points I (Q2205972):
Displaying 24 items.
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions (Q2026726)
- Some worst-case datasets of deterministic first-order methods for solving binary logistic regression (Q2028922)
- Adaptive regularization with cubics on manifolds (Q2039233)
- An adaptive high order method for finding third-order critical points of nonconvex optimization (Q2079692)
- Perturbed iterate SGD for Lipschitz continuous loss functions (Q2093279)
- A hybrid stochastic optimization framework for composite nonconvex optimization (Q2118109)
- Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting) (Q2131208)
- A cubic regularization of Newton's method with finite difference Hessian approximations (Q2138398)
- On lower iteration complexity bounds for the convex concave saddle point problems (Q2149573)
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems (Q2220653)
- Lower bounds for finding stationary points II: first-order methods (Q2220663)
- Implementable tensor methods in unconstrained convex optimization (Q2227532)
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions (Q2673524)
- Near-Optimal Hyperfast Second-Order Method for Convex Optimization (Q4965110)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649)
- Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems (Q5158768)
- Convergence guarantees for a class of non-convex and non-smooth optimization problems (Q5214248)
- Generalized Momentum-Based Methods: A Hamiltonian Perspective (Q5857293)
- A Global Dual Error Bound and Its Application to the Analysis of Linearly Constrained Nonconvex Optimization (Q5869816)
- Lower bounds for non-convex stochastic optimization (Q6038643)
- An accelerated first-order method for non-convex optimization on manifolds (Q6048700)
- Conditions for linear convergence of the gradient method for non-convex optimization (Q6097482)
- A Newton-CG Based Barrier Method for Finding a Second-Order Stationary Point of Nonconvex Conic Optimization with Complexity Guarantees (Q6114780)
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization (Q6114954)