A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization (Q2346397)

From MaRDI portal
scientific article

    Statements

    1 June 2015
    The paper is concerned with the unconstrained minimization problem \(\min f(x)\), \(x\in \mathbb R^n\), where the gradient is Lipschitz continuous, i.e., \(\| f^{\prime}(x)-f^{\prime}(y)\| \leq L\| x-y\|\) for all \(x,y\in \mathbb R^n\). The authors study a modification of the quasi-Newton iteration \[ x_{k+1}=x_k+\alpha_k d_k, \qquad d_{k+1}=-H_{k+1}f^{\prime}(x_{k+1}), \] with the self-scaling memoryless update \[ H_{k+1}=\frac{1}{\tau_k}\left(I-\frac{s_k y_k^T+y_k s_k^T}{s_k^T y_k}\right)+\left(1+\frac{\| y_k\|^2}{\tau_k\, s_k^T y_k}\right)\frac{s_k s_k^T}{s_k^T y_k}, \] where \(s_k=x_{k+1}-x_k\), \(y_k=f^{\prime}(x_{k+1})-f^{\prime}(x_k)\), the scaling parameter satisfies \(\tau_k \in \left[\frac{s_k^T y_k}{\| s_k\|^2}, \frac{\| y_k\|^2}{s_k^T y_k}\right]\), and the stepsize \(\alpha_k>0\) is obtained by a line search. A convergence analysis of the modified method is carried out for both convex and nonconvex functions \(f(x)\). Numerical experiments are also discussed.
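    Since \(H_{k+1}\) is built from the identity and two rank-one terms, the direction \(d_{k+1}=-H_{k+1}f^{\prime}(x_{k+1})\) can be computed with a few inner products, without ever forming the \(n\times n\) matrix. A minimal NumPy sketch of this matrix-vector product follows; the function name and the default choice of \(\tau_k\) (the upper endpoint \(\| y_k\|^2/(s_k^T y_k)\) of the stated interval) are our assumptions, and a line search ensuring \(s_k^T y_k>0\) is presumed to have run already:

    ```python
    import numpy as np

    def ssml_bfgs_direction(g, s, y, tau=None):
        """Direction d = -H g for the self-scaling memoryless BFGS matrix above.

        g : gradient f'(x_{k+1})
        s : step s_k = x_{k+1} - x_k
        y : gradient difference y_k = f'(x_{k+1}) - f'(x_k)
        tau : scaling in [s^T y / ||s||^2, ||y||^2 / (s^T y)]; the default
              below picks the upper endpoint (an assumed, illustrative choice).
        """
        sy = s @ y                       # curvature s_k^T y_k, positive after a Wolfe-type search
        if tau is None:
            tau = (y @ y) / sy           # assumed choice within the stated interval
        # H g = (1/tau) * (g - ((y^T g) s + (s^T g) y) / (s^T y))
        #       + (1 + ||y||^2 / (tau * s^T y)) * (s^T g / s^T y) * s
        Hg = (g - ((y @ g) * s + (s @ g) * y) / sy) / tau \
            + (1.0 + (y @ y) / (tau * sy)) * (s @ g) / sy * s
        return -Hg
    ```

    Because only inner products and vector additions appear, the cost per iteration is \(O(n)\), which is what makes the memoryless variant attractive for large-scale problems.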
    unconstrained optimization
    quasi-Newton method
    conjugate gradient method
    global convergence
    improved Wolfe line search