On the selection of parameters in Self Scaling Variable Metric Algorithms
From MaRDI portal
Publication:4051920
Cites work
- A Rapidly Convergent Descent Method for Minimization
- A new approach to variable metric algorithms
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function
- On Steepest Descent
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric Algorithms without Line Search for Unconstrained Minimization
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
Cited in (16)
- Generalized Polak-Ribière algorithm
- Mechanical system modelling using recurrent neural networks via quasi-Newton learning methods
- A new BFGS algorithm using the decomposition matrix of the correction matrix to obtain the search directions
- An assessment of two approaches to variable metric methods
- A family of variable metric updates
- A quasi-Newton based pattern search algorithm for unconstrained optimization
- On the conditioning of the Hessian approximation in quasi-Newton methods
- On measure functions for the self-scaling updating formulae for quasi-Newton methods
- Scaling damped limited-memory updates for unconstrained optimization
- Perspectives on self-scaling variable metric algorithms
- Superlinear convergence of symmetric Huang's class of methods
- Global convergence property of scaled two-step BFGS method
- Some investigations in a new algorithm for nonlinear optimization based on conic models of the objective function
- Matrix conditioning and nonlinear optimization
- A quasi-Newton method using a nonquadratic model
- A brief survey of methods for solving nonlinear least-squares problems