On a certain continuous minimization method with a variable metric (Q1380188)

From MaRDI portal
scientific article (English)

    Statements

    9 March 1998
    This article studies the scaled gradient projection method for the minimization of a Fréchet differentiable convex function \(f\) over a closed convex set \(Q\) in a Hilbert space \(H\). The \(k\)th iteration of the scaled gradient projection method is given by \(x_{k+1} = \Pi_Q^{G(x_k)}( x_k - \gamma_k G^{-1}(x_k) f'(x_k))\), where \(f'(x_k)\) is the gradient of \(f\) at \(x_k\), \(\Pi_Q^{G(x_k)}\) is the projection onto \(Q\) with respect to the metric \(\| v \| _{G(x_k)}^2 = \langle v, G(x_k) v \rangle\) induced by the self-adjoint positive definite operator \(G(x_k)\), and \(\gamma_k > 0\) is the step-size. To obtain good convergence rates, non-diagonal scalings \(G(x_k)\) must often be used; in that case, however, the computation of \(x_{k+1}\) can be very expensive, so extensions of these methods or interior point methods are typically used in practice. For a textbook overview of (scaled) gradient projection methods see Sections 2.3, 2.4, and 2.8 of \textit{D. P. Bertsekas} [Nonlinear Programming, Athena Scientific, Belmont, MA (1995)].

    This paper investigates the dynamical system \(\dot{x}(t) + x(t) = \Pi_Q^{G(x(t))}( x(t) - \gamma(t) G^{-1}(x(t)) f'(x(t)))\), \(\gamma(t) > 0\), \(x(0) = x_0\), corresponding to the scaled gradient projection method. Conditions are established that guarantee the existence of \(x_\infty \in Q\) such that \(\lim_{t \rightarrow \infty} \| x(t) - x_\infty \| + \| \dot{x}(t) \| = 0\). In the presence of inexact gradient information, Tikhonov regularization is studied, in which \(f(x(t))\) and \(f'(x(t))\) are replaced by \(T(x(t),t) = f(x(t)) + (\alpha(t)/2) \| x(t) \| ^2\) and \(T'(x(t),t) = f'(x(t)) + \alpha(t) x(t)\), respectively, and convergence to the minimum norm solution of \(\min_{x \in Q} f(x)\) is established.
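    The discrete iteration above can be illustrated with a minimal finite-dimensional sketch. The problem instance is illustrative and not from the paper: we take \(Q\) to be a box and \(G(x_k)\) a constant diagonal operator, so that the \(G\)-metric projection onto \(Q\) separates across coordinates and reduces to componentwise clipping (exactly the expensive-projection issue the review notes for non-diagonal \(G\) does not arise here).

    ```python
    import numpy as np

    # Sketch of the scaled gradient projection iteration
    #   x_{k+1} = Pi_Q^{G}( x_k - gamma_k G^{-1} f'(x_k) )
    # for a box Q = [lo, hi]^n and a fixed diagonal scaling G (illustrative choices,
    # not the setting of the paper, which works in a general Hilbert space).
    def scaled_gradient_projection(grad, G_diag, lo, hi, x0, gamma=0.1, iters=500):
        x = x0.astype(float)
        for _ in range(iters):
            y = x - gamma * grad(x) / G_diag   # scaled gradient step: x - gamma * G^{-1} f'(x)
            x = np.clip(y, lo, hi)             # G-metric projection onto the box;
                                               # diagonal G => componentwise clip
        return x

    # Example: f(x) = 0.5 ||x - c||^2, so f'(x) = x - c; the minimizer of f over
    # the box [-1, 1]^3 is the componentwise clip of c.
    c = np.array([2.0, -3.0, 0.5])
    x_star = scaled_gradient_projection(lambda x: x - c,
                                        G_diag=np.array([1.0, 2.0, 4.0]),
                                        lo=-1.0, hi=1.0,
                                        x0=np.zeros(3))
    ```

    The continuous flow studied in the paper relates to this loop through a forward Euler step: discretizing \(\dot{x}(t) + x(t) = \Pi_Q^{G(x(t))}(\cdot)\) with step size \(h\) gives \(x_{k+1} = (1-h)\, x_k + h\, \Pi_Q^{G(x_k)}(\cdot)\), which recovers the iteration above when \(h = 1\).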
    gradient projection method
    convex constraints
    convex objective function
