Smooth nonlinear optimization in \(\mathbb R^n\) (Q1374463)
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Smooth nonlinear optimization in \(\mathbb R^n\) | scientific article | |
Statements
Smooth nonlinear optimization in \(\mathbb R^n\) (English)
8 December 1997
The main purpose of this book is to motivate the use of tools from differential geometry in nonlinear optimization. The first two chapters are introductory. Chapter 3 is concerned with necessary and sufficient conditions for optimality, whereas Chapter 4 studies the geometric background of such conditions. The derivation in Chapter 3 is differential geometric in nature; it yields the classical first- and second-order optimality conditions, as shown in Chapter 5.

In Chapter 6 the concept of convexity is generalized by means of geodesic convexity. In fact, a subset \(A\) of a Riemannian manifold is called geodesically convex if any two points of \(A\) can be joined by a geodesic lying in \(A\); geodesic convexity of functions is defined via geodesic arcs, using the arc length parametrization (both notions are sketched formally below). For geodesically convex problems, every local minimum is also global. Among other results, first- and second-order characterizations of the geodesic convexity of a function are given. Geodesic convexity also plays a basic role in Chapter 7 on the connectedness of the solution set of complementarity systems.

Chapter 8 is devoted to nonlinear coordinate representations: nonlinear coordinate transformations in smooth optimization are studied from the viewpoint of variable metric algorithms. In particular, the relationship between metrics, projections and right inverses is clarified. In Chapter 9, tensor approximations of smooth functions on Riemannian manifolds are considered, and geodesic convexity is characterized by means of tensor calculus. Geodesic convexity of the positive orthant with respect to various Riemannian metrics (including the affine metric) is studied in Chapter 10.

Chapter 11 deals with variable metric methods along geodesics. For two methods (differing in stepsize control), a general convergence theorem for finding stationary points is presented, and the rate of convergence is studied. In Chapter 12, the (polynomial) behaviour of interior point algorithms is analysed by means of a variable metric approach; as examples, the Riemannian metrics of the affine and projective scaling vector fields are discussed.

Chapter 13 is concerned with special function classes (e.g., geodesically quasiconvex and geodesically pseudoconvex functions, and differences of two geodesically convex functions). In Chapter 14, Fenchel's problem of level sets is solved in the smooth case. Finally, three appendices round off the book: (A) on the connection between mechanics and optimization, (B) on point-set topology, and (C) on Riemannian geometry.

The book is well written, easy to read, and contains many interesting details. Altogether, it forms a valuable contribution to the literature on optimization theory.
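A minimal formal sketch of the geodesic convexity notions referred to above; the notation \((M,g)\), \(A\), \(\gamma\) and \(f\) is chosen here for illustration and is not taken from the book. A subset \(A\) of a Riemannian manifold \((M,g)\) is geodesically convex if
\[
\forall\, x,y\in A\quad \exists\ \text{a geodesic } \gamma\colon[0,1]\to A \quad\text{with}\quad \gamma(0)=x,\ \ \gamma(1)=y,
\]
and a function \(f\colon A\to\mathbb R\) is geodesically convex if it is convex along every such geodesic, parametrized proportionally to arc length:
\[
f\bigl(\gamma(t)\bigr)\ \le\ (1-t)\,f\bigl(\gamma(0)\bigr)+t\,f\bigl(\gamma(1)\bigr),\qquad t\in[0,1].
\]
For \(M=\mathbb R^n\) with the Euclidean metric, geodesics are straight-line segments and both notions reduce to ordinary convexity; the local-minima-are-global property mentioned in the review then follows from the same one-dimensional argument as in the Euclidean case.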