Optimization theory and methods. Nonlinear programming (Q2500511)
From MaRDI portal
Revision as of 23:19, 28 February 2024
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Optimization theory and methods. Nonlinear programming | scientific article |
Statements
Optimization theory and methods. Nonlinear programming (English)
17 August 2006
Optimization is a subject that is widely and increasingly used in science, engineering, economics, management, industry, and other areas. It deals with selecting the best of many possible decisions in a real-world environment, constructing computational methods to find optimal solutions, exploring the theoretical properties of those methods, and studying the computational performance of the numerical algorithms built on them. Along with the rapid development of high-performance computers and progress in computational methods, more and more large-scale optimization problems have been studied and solved. As pointed out by Professor Yuqi He of Harvard University, a member of the US National Academy of Engineering, `optimization is a cornerstone for the development of civilization'.

This book systematically introduces optimization theory and methods, discusses optimality conditions in detail, and develops computational methods for unconstrained, constrained, and nonsmooth optimization. It mainly treats unconstrained and constrained optimization problems from the viewpoints of both theory and numerical methods. The book contains many recent research results on nonlinear programming, including those of the authors, for example on line search, Newton's methods, the self-scaling variable metric method, the conjugate gradient method, (non-)quasi-Newton methods, trust-region and conic model methods, nonlinear least-squares problems, quadratic programming, penalty function methods, feasible direction methods, and sequential quadratic programming. The last chapter gives a simple and comprehensive introduction to nonsmooth optimization. The book is self-contained, systematic in theory and algorithms, and easy to read. For most methods, the authors motivate the idea, present the derivation, establish global and local convergence, and indicate the efficiency and reliability of the numerical performance.
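Among the topics the review highlights are line search and Newton's methods. As a minimal illustrative sketch (not code from the book under review), a damped Newton iteration with Armijo backtracking line search can be written as follows; the Rosenbrock test function, the starting point, and all parameter values are our own choices for illustration:

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink the step until sufficient decrease holds."""
    fx, slope = f(x), grad(x).dot(d)
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def newton_minimize(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Damped Newton: solve H d = -g for the direction, then line-search the step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)
        if g.dot(d) >= 0:  # Hessian not positive definite here: fall back to steepest descent
            d = -g
        x = x + backtracking_line_search(f, grad, x, d) * d
    return x

# Rosenbrock test function with minimizer (1, 1), a standard benchmark.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400*x[1] + 1200*x[0]**2, -400*x[0]],
                           [-400*x[0], 200.0]])

x_star = newton_minimize(f, grad, hess, [-1.2, 1.0])  # should approach (1, 1)
```

The line search globalizes the method (progress from poor starting points), while the Newton direction gives fast local convergence near the solution, mirroring the global/local convergence analysis the book carries out for most of its methods.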
The book also contains an extensive, though not exhaustive, bibliography of 366 references, an important part of the book that is useful to readers for further study. Each chapter ends with an exercise set. Many exercises involve filling in details of proofs in the text or proving results that are needed elsewhere, while others supplement the text by exploring examples or additional material. The Appendix gives test functions for unconstrained and constrained optimization problems.
line search
(inexact) Newton method
self-scaling variable metric method
(non-)quasi-Newton methods
conjugate gradient method
trust-region and conic model methods
nonlinear least-squares problems
theory of constrained optimization
quadratic programming
penalty function methods
feasible direction methods
sequential quadratic programming
trust-region methods for constrained problems
nonsmooth optimization