A nonmonotone approximate sequence algorithm for unconstrained nonlinear optimization
From MaRDI portal
Publication:2436686
DOI: 10.1007/s10589-013-9588-x — zbMath: 1286.90144 — OpenAlex: W2113412082 — MaRDI QID: Q2436686
Publication date: 25 February 2014
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-013-9588-x
Keywords: unconstrained optimization; gradient methods; nonlinear conjugate gradient methods; optimal convergence rate; nonmonotone algorithm; convex estimate sequence
Uses Software
Cites Work
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Introductory lectures on convex optimization. A basic course.
- A New Active Set Algorithm for Box Constrained Optimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- CUTE
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The Limited Memory Conjugate Gradient Method
- Benchmarking optimization software with performance profiles.
- Adaptive two-point stepsize gradient algorithm