Recent advances in unconstrained optimization
Publication: 5636700
DOI: 10.1007/BF01584071 · zbMath: 0228.90042 · OpenAlex: W2021350363 · MaRDI QID: Q5636700
Publication date: 1971
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf01584071
Related Items
- Aggregate production planning by stochastic control
- Exact penalty functions in nonlinear programming
- A Monte Carlo study of estimators of stochastic frontier production functions
- Between Newton and Cauchy: the diagonal variable metric method; theory and test results
- Conjugate gradient methods using quasi-Newton updates with inexact line searches
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- Practical convergence conditions for the Davidon-Fletcher-Powell method
- Local convergence of the steepest descent method in Hilbert spaces
- Estimation in a disequilibrium model and the value of information
- Alternative parameter estimators based upon grouped data
- An assessment of two approaches to variable metric methods
- A conjugate direction algorithm without line searches
- Unified approach to unconstrained minimization via basic matrix factorizations
- Time evolutional analysis of nonlinear structures
- On optimal regulation of a storage level with application to the water level regulation of a lake
- Some simplified algorithms for the Bayesian identification of aircraft parameters
- Optimal simultaneous maximum a posteriori estimation of states, noise statistics and parameters I. Algorithm
- Self-Scaling Variable Metric Algorithms without Line Search for Unconstrained Minimization
- On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems
Cites Work
- Triangular factors of modified matrices
- An effective algorithm for minimization
- Memory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Properties of the conjugate-gradient and Davidon methods
- Variable Metric Method for Minimization
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- An efficient method for finding the minimum of a function of several variables without calculating derivatives
- Maximization by Quadratic Hill-Climbing
- A Comparison of Several Current Optimization Methods, and the use of Transformations in Constrained Problems
- A minimal point of a finite metric set
- Quasi-Newton Methods and their Application to Function Minimisation
- Variance algorithm for minimization
- On the Relative Efficiencies of Gradient Methods
- On a Numerical Instability of Davidon-Like Methods
- Convergence Conditions for Ascent Methods
- Lattice Approximations to the Minima of Functions of Several Variables
- The Local Dependence of Least Squares Cubic Splines
- Minimizing a function without calculating derivatives
- A Survey of Numerical Methods for Unconstrained Optimization
- A Family of Variable-Metric Methods Derived by Variational Means
- Computational experience with quadratically convergent minimisation methods
- Comparison of Gradient Methods for the Solution of Nonlinear Parameter Estimation Problems
- Variations on Variable-Metric Methods
- Variable metric methods of minimisation
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Optimal Conditioning of Quasi-Newton Methods
- A New Algorithm for Unconstrained Optimization
- A Modification of Davidon's Minimization Method to Accept Difference Approximations of Derivatives