Exploiting negative curvature in deterministic and stochastic optimization
DOI: 10.1007/s10107-018-1335-8 · zbMATH Open: 1417.49036 · arXiv: 1703.00412 · OpenAlex: W2963321060 · Wikidata: Q129134954 · Scholia: Q129134954 · MaRDI QID: Q2425164 · FDO: Q2425164
Frank E. Curtis, Daniel P. Robinson
Publication date: 26 June 2019
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1703.00412
Keywords: machine learning; nonconvex optimization; stochastic optimization; second-order methods; negative curvature; modified Newton methods
MSC classifications: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Stochastic programming (90C15); Methods of quasi-Newton type (90C53); Newton-type methods (49M15); Numerical methods based on necessary conditions (49M05)
Cites Work
- Title not available
- Title not available
- Title not available
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Exploiting negative curvature directions in linesearch methods for unconstrained optimization
- A stabilized SQP method: global convergence
- A stabilized SQP method: superlinear convergence
- Curvilinear path steplength algorithms for minimization which use directions of negative curvature
- On the use of directions of negative curvature in a modified Newton method
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- Computing Modified Newton Directions Using a Partial Cholesky Factorization
- Optimization Methods for Large-Scale Machine Learning
- A Solver for Nonconvex Bound-Constrained Quadratic Optimization
- A nonconvex formulation for low rank subspace clustering: algorithms and convergence analysis
- First-order methods almost always avoid strict saddle points
- A Newton-Based Method for Nonconvex Optimization with Fast Evasion of Saddle Points
Cited In (14)
- The global optimization geometry of shallow linear neural networks
- Fully stochastic trust-region sequential quadratic programming for equality-constrained optimization problems
- Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization
- Complexity analysis of interior-point methods for second-order stationary points of nonlinear semidefinite optimization problems
- Regional complexity analysis of algorithms for nonconvex smooth optimization
- Iterative grossone-based computation of negative curvature directions in large-scale optimization
- Polarity and conjugacy for quadratic hypersurfaces: a unified framework with recent advances
- Stochastic optimization of multiplicative functions with negative value
- A deterministic gradient-based approach to avoid saddle points
- Combining stochastic adaptive cubic regularization with negative curvature for nonconvex optimization
- A fully stochastic second-order trust region method
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- Open Problem—Iterative Schemes for Stochastic Optimization: Convergence Statements and Limit Theorems
- Worst-case complexity of an SQP method for nonlinear equality constrained stochastic optimization
Uses Software