Optimally conditioned optimization algorithms without line searches
From MaRDI portal
Publication: 4094661
DOI: 10.1007/BF01681328 · zbMath: 0328.90055 · MaRDI QID: Q4094661
Publication date: 1975
Published in: Mathematical Programming
Related Items (57)
Computational experience with known variable metric updates ⋮ The least prior deviation quasi-Newton update ⋮ Family of optimally conditioned quasi-Newton updates for unconstrained optimization ⋮ The linear algebra of block quasi-Newton algorithms ⋮ Minimum Norm Symmetric Quasi-Newton Updates Restricted to Subspaces ⋮ A Bench Mark Experiment for Minimization Algorithms ⋮ Inertia-preserving secant updates ⋮ Transformation of uniformly distributed particle ensembles ⋮ Parallel quasi-Newton methods for unconstrained optimization ⋮ Performance of various BFGS implementations with limited precision second-order information ⋮ A Broyden Class of Quasi-Newton Methods for Riemannian Optimization ⋮ A generalized direct search acceptable-point technique for use with descent-type multivariate algorithms ⋮ An Optimal Broyden Updating Formula And Its Application To Nonlinear Least Squares ⋮ A product positioning model with costs and prices ⋮ New combined method for unconstrained minimization ⋮ Updating conjugate directions by the BFGS formula ⋮ On the distribution of the likelihood ratio test statistic for a mixture of two normal distributions ⋮ Superlinear convergence of symmetric Huang's class of methods ⋮ Estimating matrices ⋮ A symmetric rank-one method based on extra updating techniques for unconstrained optimization ⋮ Unnamed Item ⋮ Unnamed Item ⋮ Unnamed Item ⋮ An adaptive competitive penalty method for nonsmooth constrained optimization ⋮ Minimization Algorithms for Functions with Random Noise ⋮ Some notes on the quasi-Newton methods ⋮ Matrix conditioning and nonlinear optimization ⋮ A conjugate direction algorithm without line searches ⋮ Unified approach to unconstrained minimization via basic matrix factorizations ⋮ Numerical comparison of several variable metric algorithms ⋮ Generation of classes of symmetric rank-2 secant updates and the maximality of the Davidon class ⋮ An algorithm for minimizing a differentiable function subject to box constraints and errors ⋮ Some remarks on the symmetric rank-one update ⋮ Two new unconstrained optimization algorithms which use function and gradient values ⋮ Combined lp and quasi-Newton methods for minimax optimization ⋮ A new arc algorithm for unconstrained optimization ⋮ Analysis of a new algorithm for one-dimensional minimization ⋮ A family of variable metric updates ⋮ Generating conjugate directions without line searches using factorized variable metric updating formulas ⋮ Techniques for nonlinear least squares and robust regression ⋮ On a conjecture of Dixon and other topics in variable metric methods ⋮ On averaging and representation properties of the BFGS and related secant updates ⋮ Quadratic termination properties of Davidon's new variable metric algorithm ⋮ The revised DFP algorithm without exact line search ⋮ Optimal conditioning in the convex class of rank two updates ⋮ Differential optimization techniques ⋮ Structured symmetric rank-one method for unconstrained optimization ⋮ Unnamed Item ⋮ Modules ⋮ Some investigations in a new algorithm for nonlinear optimization based on conic models of the objective function ⋮ The coordinex problem and its relation to the conjecture of Wilkinson ⋮ Variable metric methods for unconstrained optimization and nonlinear least squares ⋮ An alternative variational principle for variable metric updating ⋮ A parallel unconstrained quasi-Newton algorithm and its performance on a local memory parallel computer ⋮ Variational quasi-Newton methods for unconstrained optimization ⋮ A CLASS OF DFP ALGORITHMS WITH REVISED SEARCH DIRECTION ⋮ A fast and robust unconstrained optimization method requiring minimum storage
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- An Iterative Method for Finding Stationary Values of a Function of Several Variables
- Variable Metric Method for Minimization
- A Rapidly Convergent Descent Method for Minimization
- A Comparison of Several Current Optimization Methods, and the use of Transformations in Constrained Problems
- A minimal point of a finite metric set
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Quasi-newton algorithms generate identical points
- Rank-one and Rank-two Corrections to Positive Definite Matrices Expressed in Product Form
This page was built for publication: Optimally conditioned optimization algorithms without line searches