An algorithm that minimizes homogeneous functions of n variables in n + 2 iterations and rapidly minimizes general functions
From MaRDI portal
Publication:2540947
Cites work
- scientific article; zbMATH DE number 3329468
- scientific article; zbMATH DE number 3352737
- scientific article; zbMATH DE number 3407464
- A Rapidly Convergent Descent Method for Minimization
- An Algorithm for the Calculation of the Pseudo-Inverse of a Singular Matrix
- Computational experience with quadratically convergent minimisation methods
- Quasi-Newton Methods and their Application to Function Minimisation
- Variance algorithm for minimization
Cited in (15)
- The effect of data grid size on certain interpolation methods for unconstrained function minimization
- A homogeneous method for unconstrained optimization
- Differential gradient methods
- Some computational advances in unconstrained optimization
- Fast Givens transformations applied to the homogeneous optimization method
- Unconstrained optimization based on homogeneous models
- A modified homogeneous algorithm for function minimization
- A new conic method for unconstrained minimization
- Parallel algorithms for nonlinear programming problems
- A Newton-type curvilinear search method for optimization
- Nonlinear programming on a microcomputer
- Matrix conditioning and nonlinear optimization
- A numerically stable reduced-gradient type algorithm for solving large-scale linearly constrained minimization problems
- A robust conjugate-gradient algorithm which minimizes L-functions
- Approximation methods for the unconstrained optimization
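The class of functions named in the title can be illustrated briefly. A minimal sketch (not taken from the cataloged paper itself, whose details are not reproduced on this page): a function f is homogeneous of degree gamma when f(t*x) = t**gamma * f(x), and Euler's identity then gives x · ∇f(x) = gamma * f(x); methods for homogeneous-function minimization exploit this structure. The function and degree below are illustrative assumptions.

```python
def f(x):
    """f(x) = (x1^2 + ... + xn^2)^2, homogeneous of degree 4."""
    s = sum(xi * xi for xi in x)
    return s * s

def grad_f(x):
    """Analytic gradient of f: 4 * ||x||^2 * x."""
    s = sum(xi * xi for xi in x)
    return [4.0 * s * xi for xi in x]

x = [1.5, -2.0, 0.5]
gamma = 4.0

# Euler's identity for a degree-gamma homogeneous function:
# x . grad f(x) == gamma * f(x)
euler_lhs = sum(xi * gi for xi, gi in zip(x, grad_f(x)))
euler_rhs = gamma * f(x)
print(abs(euler_lhs - euler_rhs) < 1e-9)  # → True
```

The same identity can be checked for any scaling t, since f(t*x) = t**gamma * f(x) by construction.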