Lower bounds for finding stationary points I
DOI: 10.1007/s10107-019-01406-y · zbMath: 1451.90128 · arXiv: 1710.11606 · OpenAlex: W2964000362 · MaRDI QID: Q2205972
Yair Carmon, Oliver Hinder, Aaron Sidford, John C. Duchi
Publication date: 21 October 2020
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1710.11606
Keywords: non-convex optimization; information-based complexity; gradient descent; cubic regularization of Newton's method; dimension-free rates
MSC: Analysis of algorithms and problem complexity (68Q25) · Large-scale problems in mathematical programming (90C06) · Abstract computational complexity for mathematical programming problems (90C60) · Nonconvex programming, global optimization (90C26)
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Complexity bounds for second-order optimality in unconstrained optimization
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- On the limited memory BFGS method for large scale optimization
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Introductory lectures on convex optimization. A basic course.
- A geometric analysis of phase retrieval
- Oracle complexity of second-order methods for smooth convex optimization
- Cubic regularization of Newton method and its global performance
- On Lower and Upper Bounds for Smooth and Strongly Convex Optimization Problems
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Some NP-complete problems in quadratic and nonlinear programming
- Accelerated Methods for Nonconvex Optimization
- Black-Box Complexity of Local Minimization
- The Best Rank-1 Approximation of a Symmetric Tensor and Related Spherical Optimization Problems
- Finding approximate local minima faster than gradient descent
- Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- A note about the complexity of minimizing Nesterov's smooth Chebyshev–Rosenbrock function
- On Nesterov's smooth Chebyshev–Rosenbrock function
- Lower Bounds on the Oracle Complexity of Nonsmooth Convex Optimization via Information Theory
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- On Recursions Connected With Symmetric Groups I