On Steepest Descent
From MaRDI portal
Publication: 5626157
DOI: 10.1137/0303013 · zbMath: 0221.65094 · OpenAlex: W2062106836 · MaRDI QID: Q5626157
Publication date: 1965
Published in: Journal of the Society for Industrial and Applied Mathematics Series A Control
Full work available at URL: https://doi.org/10.1137/0303013
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Mathematical programming (90C99); Numerical analysis in abstract spaces (65J99)
Related Items
Convergence of quasi-Newton method with new inexact line search
Conjugate gradient algorithms in nonlinear structural analysis problems
Unnamed Item
On the convergence of interior-reflective Newton methods for nonlinear minimization subject to bounds
A generalized conditional gradient method and its connection to an iterative shrinkage method
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
Efficient line search algorithm for unconstrained optimization
Variable metric gradient projection processes in convex feasible sets defined by nonlinear inequalities
Properties of the sequential gradient-restoration algorithm (SGRA). II: Convergence analysis
Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
Efficient hybrid conjugate gradient techniques
Convergence of the Polak-Ribière-Polyak conjugate gradient method
Hybridization rule applied on accelerated double step size optimization scheme
Convergence and stability of line search methods for unconstrained optimization
A new class of nonlinear conjugate gradient coefficients with global convergence properties
Global convergence of the gradient method for functions definable in o-minimal structures
An Efficient and Robust Scalar Auxiliary Variable Based Algorithm for Discrete Gradient Systems Arising from Optimizations
On the nonmonotonicity degree of nonmonotone line searches
Global convergence properties of the BBB conjugate gradient method
Sign projected gradient flow: a continuous-time approach to convex optimization with linear equality constraints
On the selection of parameters in Self Scaling Variable Metric Algorithms
A survey of gradient methods for solving nonlinear optimization
The hybrid BFGS-CG method in solving unconstrained optimization problems
A class of gradient unconstrained minimization algorithms with adaptive stepsize
Choice of a step-length in an almost everywhere differentiable (on every direction) (almost everywhere locally Lipschitz) lower-semi-continuous minimization problem
A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
A natural vector/matrix notation applied in an efficient and robust return-mapping algorithm for advanced yield functions
Convergence of PRP method with new nonmonotone line search
Memory gradient method with Goldstein line search
Hybridization of accelerated gradient descent method
Optimal conditioning of self-scaling variable metric algorithms
Levenberg-Marquardt method for solving systems of absolute value equations
Modified nonmonotone Armijo line search for descent method
Line search methods with guaranteed asymptotical convergence to an improving local optimum of multimodal functions
A new nonmonotone line search technique for unconstrained optimization
On recursive averaging processes and Hilbert space extensions of the contraction mapping principle
On step-size estimation of line search methods
Convergence of nonmonotone line search method
Convexity, monotonicity, and gradient processes in Hilbert space
Curvilinear path steplength algorithms for minimization which use directions of negative curvature
An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
Conditional gradient algorithms with open loop step size rules
Accelerated gradient descent methods with line search
Nonoptimal termination properties of quadratic interpolation univariate searches
In favor of conjugate directions: a generalized acceptable-point algorithm for function minimization
Composing Scalable Nonlinear Algebraic Solvers
Optimization of Lipschitz continuous functions
An effective algorithm for minimization
A generalized conjugate gradient algorithm
A multi-local optimization algorithm
Computer Algebra and Line Search
A comparison of nonlinear optimization methods for supervised learning in multilayer feedforward neural networks
An iterative method for determining a solution of certain nonlinear operator equations in Hilbert space, with application to systems of Hammerstein integral equations
A globalization procedure for solving nonlinear systems of equations
Self-Scaling Variable Metric Algorithms without Line Search for Unconstrained Minimization
Optimization approach for the Monge-Ampère equation
On the convergence of projected gradient processes to singular critical points
Acceleration of conjugate gradient algorithms for unconstrained optimization
A q-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
A spectral KRMI conjugate gradient method under the strong-Wolfe line search
An application of the generalized Aitken-Steffensen method to the problem of minimizing a function
MERLIN-3.0. A multidimensional optimization environment
A discrete Newton algorithm for minimizing a function of many variables
Constrained linear quadratic Gaussian control with process applications
An efficient line search for nonlinear least squares
A self-tuning regulator based on optimal output feedback theory
The higher-order Levenberg–Marquardt method with Armijo type line search for nonlinear equations
A new class of nonlinear conjugate gradient coefficients for unconstrained optimization
Convergence of descent method with new line search
Variable metric random pursuit