Matrix conditioning and nonlinear optimization
Publication:4147882
DOI: 10.1007/BF01588962
zbMath: 0371.90109
OpenAlex: W2035079355
MaRDI QID: Q4147882
David F. Shanno, Kang Hoh Phua
Publication date: 1978
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf01588962
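The indexed paper studies how the conditioning of quasi-Newton matrices affects minimization performance, and advocates a single scaling of the initial inverse-Hessian approximation by gamma = (yᵀs)/(yᵀy) before the first BFGS update, rather than rescaling at every iteration. Below is a minimal Python sketch of that initial-scaling idea, not code from the paper itself; the function name `bfgs_scaled`, the Armijo backtracking line search, and the Rosenbrock test problem are illustrative assumptions.

```python
import numpy as np

def bfgs_scaled(f, grad, x0, tol=1e-8, max_iter=200):
    """BFGS with a one-time Shanno-Phua-style scaling of the initial H (sketch)."""
    n = x0.size
    x = x0.astype(float)
    g = grad(x)
    H = np.eye(n)                       # inverse-Hessian approximation
    first_update = True
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        # Simple Armijo backtracking line search; a stand-in for the more
        # careful line searches used in the quasi-Newton literature.
        t, c = 1.0, 1e-4
        while f(x + t * p) > f(x) + c * t * (g @ p) and t > 1e-12:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition; skip update otherwise
            if first_update:
                H *= sy / (y @ y)       # one-time initial scaling gamma = y's / y'y
                first_update = False
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
    return x

# Usage on the Rosenbrock function (illustrative test problem):
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([
    -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
    200 * (z[1] - z[0]**2),
])
print(bfgs_scaled(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

The one-time scaling is intended to bring H into the right scale relative to the true inverse Hessian before updating begins; rescaling at every iteration, as in the self-scaling variable metric methods cited below, is the alternative the paper compares against.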
Related Items
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Adding variables to quasi-Newton Hessian approximations
- A limited memory quasi-Newton trust-region method for box constrained optimization
- On the performance of switching BFGS/SR1 algorithms for unconstrained optimization
- A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations
- Computational experience with known variable metric updates
- Computational aspects of large elastoplastic deformations in the presence of anisotropy and plastic spin
- Family of optimally conditioned quasi-Newton updates for unconstrained optimization
- A trust-region strategy for minimization on arbitrary domains
- Scaling damped limited-memory updates for unconstrained optimization
- Mechanical system modelling using recurrent neural networks via quasi-Newton learning methods
- Maintaining the positive definiteness of the matrices in reduced secant methods for equality constrained optimization
- Numerical experience with multiple update quasi-Newton methods for unconstrained optimization
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- Family of projected descent methods for optimization problems with simple bounds
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- Parallel quasi-Newton methods for unconstrained optimization
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- Using function-values in multi-step quasi-Newton methods
- A double parameter scaled BFGS method for unconstrained optimization
- Extra multistep BFGS updates in quasi-Newton methods
- Two modified scaled nonlinear conjugate gradient methods
- A new restarting adaptive trust-region method for unconstrained optimization
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A restarting approach for the symmetric rank one update for unconstrained optimization
- Perspectives on self-scaling variable metric algorithms
- An assessment of quasi-Newton sparse update techniques for nonlinear structural analysis
- New combined method for unconstrained minimization
- On large scale nonlinear network optimization
- Some numerical experiments with variable-storage quasi-Newton algorithms
- On the limited memory BFGS method for large scale optimization
- Partitioned variable metric updates for large structured optimization problems
- Computational experiments with scaled initial Hessian approximation for the Broyden family methods
- A symmetric rank-one method based on extra updating techniques for unconstrained optimization
- Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato
- A new and dynamic method for unconstrained minimization
- Unnamed Item
- Low rank updates in preconditioning the saddle point systems arising from data assimilation problems
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- Global convergence property of scaled two-step BFGS method
- Analysis of a self-scaling quasi-Newton method
- A rank-one fitting algorithm for unconstrained optimization problems
- An efficient method for nonlinearly constrained networks
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- HARES: An efficient method for first-principles electronic structure calculations of complex systems
- On the behaviour of a combined extra-updating/self-scaling BFGS method
- Implicit updates in multistep quasi-Newton methods
- A nonlinear model for function-value multistep methods
- Scaled memoryless symmetric rank one method for large-scale optimization
- Numerical comparison of several variable metric algorithms
- Wide interval for efficient self-scaling quasi-Newton algorithms
- A new arc algorithm for unconstrained optimization
- L-Broyden methods: a generalization of the L-BFGS method to the limited-memory Broyden family
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- Testing a Class of Methods for Solving Minimization Problems with Simple Bounds on the Variables
- How much do approximate derivatives hurt filter methods?
- Vectorization of conjugate-gradient methods for large-scale minimization in meteorology
- Structured symmetric rank-one method for unconstrained optimization
- Unnamed Item
- A dense initialization for limited-memory quasi-Newton methods
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Impulse noise removal by an adaptive trust-region method
- A Kantorovich theorem for the structured PSB update in Hilbert space
- The application of optimal control methodology to nonlinear programming problems
- Shifted limited-memory variable metric methods for large-scale unconstrained optimization
- An example of numerical nonconvergence of a variable-metric method
- An improved adaptive trust-region method for unconstrained optimization
- On diagonally-preconditioning the 2-step BFGS method with accumulated steps for linearly constrained nonlinear programming
- Variable metric methods for unconstrained optimization and nonlinear least squares
- On diagonally preconditioning the truncated Newton method for super-scale linearly constrained nonlinear programming
- Unnamed Item
- A compact variable metric algorithm for nonlinear minimax approximation
- An efficient implementation of a trust region method for box constrained optimization
- Extra updates for the BFGS method
- A parallel unconstrained quasi-Newton algorithm and its performance on a local memory parallel computer
- A Sparse Quasi-Newton Update Derived Variationally with a Nondiagonally Weighted Frobenius Norm
- Variationally derived scaling and variable metric updates from the preconvex part of the Broyden family
- Sizing the BFGS and DFP updates: Numerical study
- Preconditioned low-order Newton methods
- A fast and robust unconstrained optimization method requiring minimum storage
- The use of alternation and recurrences in two-step quasi-Newton methods
- Three-step fixed-point quasi-Newton methods for unconstrained optimisation
Cites Work
- Unnamed Item
- An algorithm that minimizes homogeneous functions of \(n\) variables in \(n + 2\) iterations and rapidly minimizes general functions
- On the selection of parameters in self-scaling variable metric algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms. Part I: Criteria and Sufficient Conditions for Scaling a Class of Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms. Part II: Implementation and Experiments
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Optimally conditioned optimization algorithms without line searches
- Optimal conditioning of self-scaling variable metric algorithms
- Quasi-Newton Methods and their Application to Function Minimisation
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- Methods of conjugate directions versus quasi-Newton methods