On the selection of parameters in Self Scaling Variable Metric Algorithms
From MaRDI portal
Publication: 4051920
DOI: 10.1007/BF01585530 · zbMath: 0297.90084 · MaRDI QID: Q4051920
Publication date: 1974
Published in: Mathematical Programming
Related Items
- A quasi-Newton method using a nonquadratic model
- Scaling damped limited-memory updates for unconstrained optimization
- Mechanical system modelling using recurrent neural networks via quasi-Newton learning methods
- A quasi-Newton based pattern search algorithm for unconstrained optimization
- A new BFGS algorithm using the decomposition matrix of the correction matrix to obtain the search directions
- Perspectives on self-scaling variable metric algorithms
- Superlinear convergence of symmetric Huang's class of methods
- Global convergence property of scaled two-step BFGS method
- A brief survey of methods for solving nonlinear least-squares problems
- Matrix conditioning and nonlinear optimization
- An assessment of two approaches to variable metric methods
- A family of variable metric updates
- Some investigations in a new algorithm for nonlinear optimization based on conic models of the objective function
- On measure functions for the self-scaling updating formulae for quasi-Newton methods
- On the conditioning of the Hessian approximation in quasi-Newton methods
- Generalized Polak-Ribière algorithm
Cites Work
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- Self-Scaling Variable Metric Algorithms without Line Search for Unconstrained Minimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms
- A Rapidly Convergent Descent Method for Minimization
- A new approach to variable metric algorithms
- On Steepest Descent
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function