Optimal conditioning of self-scaling variable metric algorithms
Publication: 4110808
DOI: 10.1007/BF01580654
zbMath: 0342.90045
OpenAlex: W2061054410
MaRDI QID: Q4110808
Emilio Spedicato, Shmuel S. Oren
Publication date: 1976
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf01580654
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Newton-type methods (49M15)
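For readers landing on this record, the update family named in the title can be summarized briefly. The display below is an editorial sketch in the standard notation of the self-scaling variable metric (SSVM) literature; the symbols $s_k$, $y_k$, $\gamma_k$, $\theta_k$ are assumptions of this summary, not quoted from the paper:
\[
  H_{k+1} \;=\; \gamma_k \left( H_k \;-\; \frac{H_k y_k y_k^{\top} H_k}{y_k^{\top} H_k y_k} \;+\; \theta_k\, v_k v_k^{\top} \right) \;+\; \frac{s_k s_k^{\top}}{s_k^{\top} y_k},
  \qquad
  v_k \;=\; \left( y_k^{\top} H_k y_k \right)^{1/2} \left( \frac{s_k}{s_k^{\top} y_k} \;-\; \frac{H_k y_k}{y_k^{\top} H_k y_k} \right),
\]
where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, $\gamma_k > 0$ is the self-scaling parameter, and $\theta_k \in [0,1]$ selects a member of the convex (Broyden) class; $\gamma_k = 1$ recovers the unscaled Broyden family. As the title indicates, the paper concerns choosing these parameters so as to control the conditioning of the updated inverse-Hessian approximation.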
Related Items (97)
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- Numerical simulation of tridimensional electromagnetic shaping of liquid metals
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- Computational experience with known variable metric updates
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- Family of optimally conditioned quasi-Newton updates for unconstrained optimization
- A quasi-Newton method using a nonquadratic model
- A modified scaling parameter for the memoryless BFGS updating formula
- Numerical experiments with variations of the Gauss-Newton algorithm for nonlinear least squares
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- A trust-region strategy for minimization on arbitrary domains
- Scaling damped limited-memory updates for unconstrained optimization
- Mechanical system modelling using recurrent neural networks via quasi-Newton learning methods
- Maintaining the positive definiteness of the matrices in reduced secant methods for equality constrained optimization
- Variationally derived algorithms in the ABS class for linear systems
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- Two accelerated nonmonotone adaptive trust region line search methods
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- Using function-values in multi-step quasi-Newton methods
- Symmetric Perry conjugate gradient method
- Two modified scaled nonlinear conjugate gradient methods
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A projection-based derivative free DFP approach for solving system of nonlinear convex constrained monotone equations with image restoration applications
- A modified scaled memoryless symmetric rank-one method
- Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- Computational experience with conjugate gradient algorithms
- An Optimal Broyden Updating Formula and Its Application to Nonlinear Least Squares
- A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
- A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Perspectives on self-scaling variable metric algorithms
- Gaussian processes for history-matching: application to an unconventional gas reservoir
- New Basic Hessian Approximations for Large-Scale Nonlinear Least-Squares Optimization
- New combined method for unconstrained minimization
- Some numerical experiments with variable-storage quasi-Newton algorithms
- Rank-one type quasi-Newton methods for solving unconstrained minimization problems. I: Methods and fundamental properties
- Rank-one type quasi-Newton methods for solving unconstrained minimization problems. II: n-step quadratic convergence for restart variants
- Superlinear convergence of symmetric Huang's class of methods
- A class of self-dual updating formulae in the Broyden family
- Partitioned variable metric updates for large structured optimization problems
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A combined class of self-scaling and modified quasi-Newton methods
- Computational experiments with scaled initial Hessian approximation for the Broyden family methods
- Conjugate gradient methods using quasi-Newton updates with inexact line searches
- Unnamed Item
- An adaptive nonmonotone trust region algorithm
- Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato
- Convergence acceleration of direct trajectory optimization using novel Hessian calculation methods
- Low rank updates in preconditioning the saddle point systems arising from data assimilation problems
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- Global convergence property of scaled two-step BFGS method
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- Erratum to: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A class of diagonal preconditioners for limited memory BFGS method
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A variable metric method for function minimization derived from invariancy to nonlinear scaling
- A bound to the condition number of canonical rank-two corrections and applications to the variable metric method
- On the behaviour of a combined extra-updating/self-scaling BFGS method
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Matrix conditioning and nonlinear optimization
- An assessment of two approaches to variable metric methods
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- Scaled memoryless symmetric rank one method for large-scale optimization
- A modified nonmonotone trust region line search method
- An adaptive family of projection methods for constrained monotone nonlinear equations with applications
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point
- Wide interval for efficient self-scaling quasi-Newton algorithms
- On a conjecture of Dixon and other topics in variable metric methods
- L-Broyden methods: a generalization of the L-BFGS method to the limited-memory Broyden family
- A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- Optimal conditioning in the convex class of rank two updates
- A new gradient method via quasi-Cauchy relation which guarantees descent
- Computational experience with rank-one positive definite quasi-Newton algorithms
- Unnamed Item
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- QN-like variable storage conjugate gradients
- Some investigations in a new algorithm for nonlinear optimization based on conic models of the objective function
- Variable metric methods for unconstrained optimization and nonlinear least squares
- On measure functions for the self-scaling updating formulae for quasi-Newton methods
- On the conditioning of the Hessian approximation in quasi-Newton methods
- Variationally derived scaling and variable metric updates from the preconvex part of the Broyden family
- Sizing the BFGS and DFP updates: Numerical study
- A fast and robust unconstrained optimization method requiring minimum storage
Cites Work
- Unnamed Item
- A bound to the condition number of canonical rank-two corrections and applications to the variable metric method
- Self-Scaling Variable Metric Algorithms without Line Search for Unconstrained Minimization
- Self-Scaling Variable Metric (SSVM) Algorithms, Part I
- Self-Scaling Variable Metric (SSVM) Algorithms, Part II
- A new approach to variable metric algorithms
- On Steepest Descent
- Optimal Conditioning of Quasi-Newton Methods