Generalized uniformly optimal methods for nonlinear programming
Abstract: In this paper, we present a generic framework that extends existing uniformly optimal convex programming algorithms to more general nonlinear, possibly nonconvex, optimization problems. The basic idea is to incorporate a local search step (a gradient descent or quasi-Newton iteration) into these uniformly optimal convex programming methods and then enforce a monotone decrease of the function values computed along the trajectory. Algorithms of this type achieve the best-known complexity for nonconvex problems and the optimal complexity for convex ones, without requiring any problem parameters. As a consequence, we obtain a unified treatment of a general class of nonlinear programming problems regardless of their convexity and level of smoothness. In particular, we show that the accelerated gradient and level methods, both originally designed for solving convex optimization problems only, can be used to solve both convex and nonconvex problems uniformly. In a similar vein, we show that some well-studied techniques for nonlinear programming, e.g., the quasi-Newton iteration, can be embedded into optimal convex optimization algorithms to further enhance their numerical performance. Our theoretical and algorithmic developments are complemented by promising numerical results for several important nonconvex and nonlinear data analysis problems from the literature.
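The core mechanism described in the abstract, an accelerated method guarded by a local search step and a monotone acceptance rule, can be sketched in a few lines. The following Python snippet is a minimal illustration under simplifying assumptions, not the authors' algorithm: it uses a fixed stepsize `step` in place of the paper's parameter-free schemes, and the names `f`, `grad_f`, and `monotone_accelerated_gd` are illustrative.

```python
import numpy as np

def monotone_accelerated_gd(f, grad_f, x0, step=1e-2, n_iters=200):
    """Sketch: accelerated gradient steps with a monotone GD safeguard.

    Illustrative only; a fixed stepsize is assumed, whereas the paper's
    methods adapt to problem parameters automatically.
    """
    x = np.asarray(x0, dtype=float).copy()   # current monotone iterate
    x_prev = x.copy()                        # previous iterate, for momentum
    f_x = f(x)
    for k in range(1, n_iters + 1):
        beta = (k - 1) / (k + 2)             # Nesterov-style momentum weight
        y = x + beta * (x - x_prev)          # extrapolated (accelerated) point
        x_acc = y - step * grad_f(y)         # accelerated candidate
        x_gd = x - step * grad_f(x)          # local-search (GD) candidate
        # Monotone safeguard: accept the better candidate only if it
        # decreases f, so f(x_k) is non-increasing even for nonconvex f.
        f_new, x_new = min((f(x_acc), x_acc), (f(x_gd), x_gd),
                           key=lambda t: t[0])
        x_prev = x
        if f_new < f_x:
            x, f_x = x_new, f_new
    return x

# Usage on a toy nonconvex (double-well) objective in 2D:
f = lambda v: (v[0] ** 2 - 1.0) ** 2 + 0.5 * v[1] ** 2
grad_f = lambda v: np.array([4.0 * v[0] * (v[0] ** 2 - 1.0), v[1]])
x_hat = monotone_accelerated_gd(f, grad_f, np.array([2.0, 2.0]))
```

The safeguard is the point of the sketch: the accelerated candidate drives fast progress on convex regions, while the plain gradient step and the acceptance test keep the trajectory monotone on nonconvex ones.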
Recommendations
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- A universal modification of the linear coupling method
- Gradient methods for minimizing composite functions
- Universal gradient methods for convex optimization problems
Cites work
- scientific article (title unavailable); zbMATH DE number 3790208
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An optimal method for stochastic composite optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Fast bundle-level methods for unconstrained and ball-constrained convex optimization
- First-order methods of smooth convex optimization with inexact oracle
- Gradient methods for minimizing composite functions
- Introductory lectures on convex optimization. A basic course.
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
- New variants of bundle methods
- Non-Euclidean restricted memory level method for large-scale convex optimization
- Numerical Optimization
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Online learning for matrix factorization and sparse coding
- Optimal stochastic approximation algorithms for strongly convex stochastic composite optimization. II: Shrinking procedures and optimal algorithms
- Optimization for simulation: theory vs. practice
- Projection onto a polyhedron that exploits sparsity
- Representations of quasi-Newton matrices and their use in limited memory methods
- Stochastic simulation: Algorithms and analysis
- Universal gradient methods for convex optimization problems
- Updating Quasi-Newton Matrices with Limited Storage
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (25)
- Accelerated Stochastic Algorithms for Nonconvex Finite-Sum and Multiblock Optimization
- scientific article (title unavailable); zbMATH DE number 5735028
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Generalized pattern search methods for a class of nonsmooth optimization problems with structure
- First-order methods for convex optimization
- Universal intermediate gradient method for convex problems with inexact oracle
- Average curvature FISTA for nonconvex smooth composite optimization problems
- Complexity-optimal and parameter-free first-order methods for finding stationary points of composite optimization problems
- A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- Accelerated inexact composite gradient methods for nonconvex spectral optimization problems
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Unification of basic and composite nondifferentiable optimization
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- An average curvature accelerated composite gradient method for nonconvex smooth composite optimization problems
- A FISTA-type accelerated gradient algorithm for solving smooth nonconvex composite optimization problems
- Numerical representations of a universal subspace flow for linear programs
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Triality theory for general unconstrained global optimization problems
- An efficient adaptive accelerated inexact proximal point method for solving linearly constrained nonconvex composite problems
- Generalized momentum-based methods: a Hamiltonian perspective
- An adaptive high order method for finding third-order critical points of nonconvex optimization
- A new restricted memory level bundle method for constrained convex nonsmooth optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Conditional gradient type methods for composite nonlinear and stochastic optimization
MaRDI item: Q2316202