A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees (Q903922)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees | scientific article |
Statements
A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees (English)
15 January 2016
The authors propose a new algorithm for minimizing a locally Lipschitz function that is continuously differentiable on an open dense subset of a real vector space. The method requires only first derivatives of the cost function and builds on the quasi-Newton scheme of Broyden, Fletcher, Goldfarb and Shanno (BFGS). The algorithm is first described; the authors then give a randomized analysis of its global convergence, and conclude with an implementation of the method together with numerical experiments illustrating its behavior. The algorithm relies mainly on a line search technique.
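The BFGS-with-line-search approach described above can be sketched as follows. This is a minimal, illustrative implementation only, not the authors' algorithm: it pairs a plain BFGS inverse-Hessian update with a bisection line search for the weak Wolfe conditions, a combination commonly used on nonsmooth problems, and it omits the safeguards (such as the gradient sampling component mentioned in the keywords) that the paper uses to obtain its convergence guarantees. All names, tolerances, and the test function are illustrative choices.

```python
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    # Bisection search for a step t satisfying the weak Wolfe conditions:
    #   f(x + t d) <= f(x) + c1 * t * g0d   (sufficient decrease)
    #   grad(x + t d) @ d >= c2 * g0d       (curvature)
    lo, hi, t = 0.0, np.inf, 1.0
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0d:
            hi = t                           # step too long: shrink
        elif grad(x + t * d) @ d < c2 * g0d:
            lo = t                           # step too short: grow
        else:
            return t
        t = (lo + hi) / 2.0 if hi < np.inf else 2.0 * lo
    return t

def bfgs_nonsmooth(f, grad, x0, tol=1e-8, max_iter=200):
    # Plain BFGS maintaining an inverse-Hessian approximation H.
    # The weak Wolfe line search ensures s @ y > 0 along the way,
    # so the update below keeps H symmetric positive definite.
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                           # quasi-Newton direction
        t = weak_wolfe_line_search(f, grad, x, d)
        s = t * d
        x_new = x + s
        y = grad(x_new) - g
        sy = s @ y
        if sy > 1e-12:                       # standard BFGS inverse update
            rho = 1.0 / sy
            V = np.eye(len(x)) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x = x_new
    return x

# Nonsmooth test problem: f(x) = |x0| + 2*x1^2, minimized at the origin
# and nondifferentiable on the hyperplane x0 = 0.
f = lambda x: abs(x[0]) + 2.0 * x[1] ** 2
# Gradient where it exists; at the kink we pick the subgradient 1.0.
grad = lambda x: np.array([np.sign(x[0]) if x[0] != 0 else 1.0, 4.0 * x[1]])
x_star = bfgs_nonsmooth(f, grad, np.array([3.0, -2.0]))
```

Note that the gradient-norm stopping test never fires on a nonsmooth instance like this one (the gradient of |x0| has norm at least 1 wherever it exists), so the sketch simply runs out its iteration budget; handling termination rigorously in the nonsmooth case is precisely one of the issues the paper addresses.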
quasi-Newton methods
nonsmooth optimization
nonconvex optimization
unconstrained optimization
gradient sampling
line search methods