Matrix conditioning and nonlinear optimization
From MaRDI portal
Publication:4147882
Cites work
- scientific article; zbMATH DE number 3617919 (no title available)
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- An algorithm that minimizes homogeneous functions of \(n\) variables in \(n + 2\) iterations and rapidly minimizes general functions
- Conditioning of Quasi-Newton Methods for Function Minimization
- Methods of conjugate directions versus quasi-Newton methods
- On the selection of parameters in Self Scaling Variable Metric Algorithms
- Optimal conditioning of self-scaling variable metric algorithms
- Optimally conditioned optimization algorithms without line searches
- Quasi-Newton Methods and their Application to Function Minimisation
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms
Cited in (84)
- An assessment of quasi-Newton sparse update techniques for nonlinear structural analysis
- A new restarting adaptive trust-region method for unconstrained optimization
- A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations
- A limited memory quasi-Newton trust-region method for box constrained optimization
- Computational experiments with scaled initial Hessian approximation for the Broyden family methods
- On the performance of switching BFGS/SR1 algorithms for unconstrained optimization
- Family of projected descent methods for optimization problems with simple bounds
- Mechanical system modelling using recurrent neural networks via quasi-Newton learning methods
- Extra multistep BFGS updates in quasi-Newton methods
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- A Sparse Quasi-Newton Update Derived Variationally with a Nondiagonally Weighted Frobenius Norm
- Analysis of a self-scaling quasi-Newton method
- Adding variables to quasi-Newton Hessian approximations
- Variationally derived scaling and variable metric updates from the preconvex part of the Broyden family
- Impulse noise removal by an adaptive trust-region method
- On the behaviour of a combined extra-updating/self-scaling BFGS method
- On large scale nonlinear network optimization
- A compact variable metric algorithm for nonlinear minimax approximation
- scientific article; zbMATH DE number 4180681 (no title available)
- Testing a Class of Methods for Solving Minimization Problems with Simple Bounds on the Variables
- An improved adaptive trust-region method for unconstrained optimization
- A trust-region strategy for minimization on arbitrary domains
- Extra updates for the BFGS method
- Sizing the BFGS and DFP updates: Numerical study
- A restarting approach for the symmetric rank one update for unconstrained optimization
- An efficient method for nonlinearly constrained networks
- Shifted limited-memory variable metric methods for large-scale unconstrained optimization
- Computational aspects of large elastoplastic deformations in the presence of anisotropy and plastic spin
- Scaled memoryless symmetric rank one method for large-scale optimization
- Wide interval for efficient self-scaling quasi-Newton algorithms
- A dense initialization for limited-memory quasi-Newton methods
- The use of alternation and recurrences in two-step quasi-Newton methods
- Numerical comparison of several variable metric algorithms
- Preconditioned low-order Newton methods
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- Partitioned variable metric updates for large structured optimization problems
- A symmetric rank-one method based on extra updating techniques for unconstrained optimization
- A parallel unconstrained quasi-Newton algorithm and its performance on a local memory parallel computer
- Computational experience with known variable metric updates
- Two modified scaled nonlinear conjugate gradient methods
- A new arc algorithm for unconstrained optimization
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Family of optimally conditioned quasi-Newton updates for unconstrained optimization
- Scaling damped limited-memory updates for unconstrained optimization
- A double parameter scaled BFGS method for unconstrained optimization
- Perspectives on self-scaling variable metric algorithms
- An efficient implementation of a trust region method for box constrained optimization
- Structured symmetric rank-one method for unconstrained optimization
- The application of optimal control methodology to nonlinear programming problems
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- New combined method for unconstrained minimization
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- Some numerical experiments with variable-storage quasi-Newton algorithms
- An example of numerical nonconvergence of a variable-metric method
- On diagonally-preconditioning the 2-step BFGS method with accumulated steps for linearly constrained nonlinear programming
- On diagonally preconditioning the truncated Newton method for super-scale linearly constrained nonlinear programming
- Parallel quasi-Newton methods for unconstrained optimization
- A fast and robust unconstrained optimization method requiring minimum storage
- A new and dynamic method for unconstrained minimization
- On the limited memory BFGS method for large scale optimization
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Implicit updates in multistep quasi-Newton methods
- Variable metric methods for unconstrained optimization and nonlinear least squares
- Three-step fixed-point quasi-Newton methods for unconstrained optimisation
- Using function-values in multi-step quasi-Newton methods
- On quasi-Newton methods in fast Fourier transform-based micromechanics
- scientific article; zbMATH DE number 3812857 (no title available)
- A Kantorovich theorem for the structured PSB update in Hilbert space.
- L-Broyden methods: a generalization of the L-BFGS method to the limited-memory Broyden family
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- A nonlinear model for function-value multistep methods
- How much do approximate derivatives hurt filter methods?
- Low rank updates in preconditioning the saddle point systems arising from data assimilation problems
- Numerical experience with multiple update quasi-Newton methods for unconstrained optimization
- A rank-one fitting algorithm for unconstrained optimization problems.
- HARES: An efficient method for first-principles electronic structure calculations of complex systems
- Maintaining the positive definiteness of the matrices in reduced secant methods for equality constrained optimization
- Vectorization of conjugate-gradient methods for large-scale minimization in meteorology
- A trust-region method using extended nonmonotone technique for unconstrained optimization
- Global convergence property of scaled two-step BFGS method
- Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
This page was built for publication: Matrix conditioning and nonlinear optimization