Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
From MaRDI portal
Publication:985699
DOI: 10.1007/s00211-010-0295-6
zbMath: 1210.65116
OpenAlex: W2046367659
MaRDI QID: Q985699
Tomasz Tarnawski, Radosław Pytlak
Publication date: 6 August 2010
Published in: Numerische Mathematik
Full work available at URL: https://doi.org/10.1007/s00211-010-0295-6
Uses Software
Cites Work
- On the limited memory BFGS method for large scale optimization
- Representations of quasi-Newton matrices and their use in limited memory methods
- Global convergence of the method of shortest residuals
- Preconditioned Conjugate Gradient Algorithms for Nonconvex Problems with Box Constraints
- A fast and robust unconstrained optimization method requiring minimum storage
- Global Convergence of a Class of Trust Region Algorithms for Optimization with Simple Bounds
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- On the Identification of Active Constraints
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Goldstein-Levitin-Polyak gradient projection method
- On the Convergence of a New Conjugate Gradient Algorithm
- Algorithm 778: L-BFGS-B
- Numerical Optimization
- On the convergence of conjugate gradient algorithms
- Exposing Constraints
- CUTE
- Line search algorithms with guaranteed sufficient decrease
- An Efficient Algorithm for Large-Scale Nonlinear Programming Problems with Simple Bounds on the Variables
- Newton's Method for Large Bound-Constrained Optimization Problems
- Projected Newton Methods for Optimization Problems with Simple Constraints
- A Limited Memory Algorithm for Bound Constrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Benchmarking optimization software with performance profiles