Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
From MaRDI portal
Publication: 985699
DOI: 10.1007/s00211-010-0295-6 · zbMATH Open: 1210.65116 · OpenAlex: W2046367659 · MaRDI QID: Q985699 · FDO: Q985699
Authors: Radosław Pytlak, Tomasz Tarnawski
Publication date: 6 August 2010
Published in: Numerische Mathematik
Full work available at URL: https://doi.org/10.1007/s00211-010-0295-6
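The paper itself is not reproduced on this page. Purely as an illustration of the two ingredients the title names — preconditioning and box constraints — here is a minimal sketch of a diagonally preconditioned projected-gradient step on a box-constrained quadratic. This is an assumed, generic method for exposition, not the authors' conjugate gradient algorithm (which, among other things, manages restarts of the CG recursion as the active set changes).

```python
import numpy as np

def projected_precond_gradient(A, b, l, u, x0, tol=1e-10, max_iter=500):
    """Sketch: minimize 0.5*x'Ax - b'x subject to l <= x <= u.

    Uses a Jacobi (diagonal) preconditioner and coordinate-wise
    projection onto the box. Illustrative only; NOT the algorithm
    of Pytlak and Tarnawski.
    """
    M_inv = 1.0 / np.diag(A)                 # Jacobi preconditioner (A assumed SPD)
    # Fixed step 1/L, where L is the largest eigenvalue of M^{-1}A,
    # so the scaled projected-gradient iteration contracts.
    step = 1.0 / np.max(np.abs(np.linalg.eigvals(M_inv[:, None] * A)))
    x = np.clip(x0, l, u)                    # start from a feasible point
    for _ in range(max_iter):
        g = A @ x - b                        # gradient of the quadratic
        x_new = np.clip(x - step * (M_inv * g), l, u)  # precondition, then project
        if np.linalg.norm(x_new - x) < tol:  # fixed point = KKT point of the box QP
            return x_new
        x = x_new
    return x
```

Because the preconditioner is diagonal, projection onto the box remains a simple coordinate-wise clip; a general (e.g. limited-memory BFGS) preconditioner would couple the coordinates and make the projection step nontrivial, which is part of what the surveyed literature addresses.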
Recommendations
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- Preconditioned conjugate gradient algorithms for nonconvex problems
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Conjugate gradient algorithms in nonconvex optimization
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Cites Work
- Algorithm 778: L-BFGS-B
- Newton's Method for Large Bound-Constrained Optimization Problems
- CUTE
- Numerical Optimization
- A Limited Memory Algorithm for Bound Constrained Optimization
- Title not available
- Benchmarking optimization software with performance profiles
- On the limited memory BFGS method for large scale optimization
- Updating Quasi-Newton Matrices with Limited Storage
- Line search algorithms with guaranteed sufficient decrease
- On the Convergence of a New Conjugate Gradient Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Title not available
- Representations of quasi-Newton matrices and their use in limited memory methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- Global Convergence of a Class of Trust Region Algorithms for Optimization with Simple Bounds
- On the Identification of Active Constraints
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- On the Goldstein-Levitin-Polyak gradient projection method
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Exposing Constraints
- Global convergence of the method of shortest residuals
- On the convergence of conjugate gradient algorithms
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- A fast and robust unconstrained optimization method requiring minimum storage
- Title not available
- An Efficient Algorithm for Large-Scale Nonlinear Programming Problems with Simple Bounds on the Variables
- Preconditioned conjugate gradient algorithms for nonconvex problems
Cited In (4)
- Tackling box-constrained optimization via a new projected quasi-Newton approach
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- Preconditioned conjugate gradient algorithms for nonconvex problems
- A new nonmonotone spectral projected gradient algorithm for box-constrained optimization problems in \(m \times n\) real matrix space with application in image clustering