Convergence Conditions for Ascent Methods

From MaRDI portal
Publication: 5566849

DOI: 10.1137/1011036 · zbMath: 0177.20603 · OpenAlex: W1988849934 · Wikidata: Q56560305 · Scholia: Q56560305 · MaRDI QID: Q5566849

Philip Wolfe

Publication date: 1969

Published in: SIAM Review

Full work available at URL: https://doi.org/10.1137/1011036
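For context: this is the paper in which Wolfe introduced the two step-length conditions now known as the Wolfe conditions (a sufficient-decrease condition plus a curvature condition), which underlie the line-search convergence results of many of the related items below. A minimal one-dimensional sketch of checking both conditions (the function name, test function, and parameter defaults are illustrative, not taken from the paper):

```python
# Hedged illustration of the two Wolfe conditions for an inexact line search.
# All names here are illustrative; c1 and c2 are conventional default values.

def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Return (armijo_ok, curvature_ok) for step length alpha along direction p (1-D case)."""
    fx = f(x)
    slope = grad(x) * p  # directional derivative of f at x along p
    # Sufficient decrease (Armijo) condition: f(x + a p) <= f(x) + c1 a g'p
    armijo = f(x + alpha * p) <= fx + c1 * alpha * slope
    # Curvature condition: g(x + a p)'p >= c2 g'p
    curvature = grad(x + alpha * p) * p >= c2 * slope
    return armijo, curvature

# Example: f(x) = x^2 from x = 1 along the descent direction p = -1.
f = lambda x: x * x
grad = lambda x: 2.0 * x
print(wolfe_conditions(f, grad, x=1.0, p=-1.0, alpha=0.5))  # both conditions hold
print(wolfe_conditions(f, grad, x=1.0, p=-1.0, alpha=2.0))  # overshoot: Armijo fails
```

A step satisfying both conditions rules out steps that are too long (via sufficient decrease) and too short (via curvature), which is the mechanism behind the convergence theorem of this paper.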



Related Items

Stopping criteria for linesearch methods without derivatives, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, Nonlinear conjugate gradient methods for the optimal control of laser surface hardening, Gaussian process regression for maximum entropy distribution, Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization, Descent methods for composite nondifferentiable optimization problems, Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property, A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization, STUDYING THE BASIN OF CONVERGENCE OF METHODS FOR COMPUTING PERIODIC ORBITS, Sparse-grid, reduced-basis Bayesian inversion: nonaffine-parametric nonlinear equations, Optimal control of bioprocess systems using hybrid numerical optimization algorithms, An outlier-resistant \(\kappa\)-generalized approach for robust physical parameter estimation, Efficient line search algorithm for unconstrained optimization, LMBOPT: a limited memory method for bound-constrained optimization, A NOTE ON THE CONVERGENCE OF THE DFP ALGORITHM ON QUADRATIC UNIFORMLY CONVEX FUNCTIONS, New hybrid conjugate gradient method as a convex combination of LS and FR methods, Maintaining the positive definiteness of the matrices in reduced secant methods for equality constrained optimization, Linearly convergent descent methods for the unconstrained minimization of convex quadratic splines, Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization, A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, An optimal control framework for dynamic induction control of wind farms and their interaction with the atmospheric boundary layer, Adaptive three-term PRP algorithms without 
gradient Lipschitz continuity condition for nonconvex functions, On efficiently combining limited-memory and trust-region techniques, An active set trust-region method for bound-constrained optimization, A recalling-enhanced recurrent neural network: conjugate gradient learning algorithm and its convergence analysis, Constrained optimal control of switched systems based on modified BFGS algorithm and filled function method, New three-term conjugate gradient method with guaranteed global convergence, Stochastic quasi-Newton with line-search regularisation, Constrained neural network training and its application to hyperelastic material modeling, A decent three term conjugate gradient method with global convergence properties for large scale unconstrained optimization problems, Diagonal approximation of the Hessian by finite differences for unconstrained optimization, A diagonal quasi-Newton updating method for unconstrained optimization, Descentwise inexact proximal algorithms for smooth optimization, A link between the steepest descent method and fixed-point iterations, New conjugate gradient method for unconstrained optimization, New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method, A double parameter self-scaling memoryless BFGS method for unconstrained optimization, A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods, A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization, Stochastic global optimization methods part I: Clustering methods, A scaled nonlinear conjugate gradient algorithm for unconstrained optimization, A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property, On the sufficient descent property of the Shanno's conjugate gradient method, Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition, A modified secant equation quasi-Newton method for 
unconstrained optimization, GLOBAL CONVERGENCE OF A SPECIAL CASE OF THE DAI–YUAN FAMILY WITHOUT LINE SEARCH, A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization, Semi-discrete optimal transport: a solution procedure for the unsquared Euclidean distance case, A Modified Nonmonotone Hestenes–Stiefel Type Conjugate Gradient Methods for Large-Scale Unconstrained Problems, A survey of gradient methods for solving nonlinear optimization, Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization, A class of gradient unconstrained minimization algorithms with adaptive stepsize, An extension of Curry's theorem to steepest descent in normed linear spaces, A new gradient method via least change secant update, Global convergence of a modified BFGS-type method for unconstrained non-convex minimization, A two-dimensional search used with a nonlinear least-squares solver, Convergence conditions for restarted conjugate gradient methods with inaccurate line searches, Backtracking gradient descent method and some applications in large scale optimisation. 
II: Algorithms and experiments, Practical convergence conditions for the Davidon-Fletcher-Powell method, An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition, An efficient solution scheme for small-strain crystal-elasto-viscoplasticity in a dual framework, Hybridization of accelerated gradient descent method, Scaled conjugate gradient algorithms for unconstrained optimization, Self-adaptive inexact proximal point methods, Levenberg-Marquardt method for solving systems of absolute value equations, Line search methods with guaranteed asymptotical convergence to an improving local optimum of multimodal functions, Two descent hybrid conjugate gradient methods for optimization, On step-size estimation of line search methods, Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Convergence of nonmonotone line search method, A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods, ADAPTIVE ALGORITHMS FOR NEURAL NETWORK SUPERVISED LEARNING: A DETERMINISTIC OPTIMIZATION APPROACH, Hybrid Riemannian conjugate gradient methods with global convergence properties, An acceleration of gradient descent algorithm with backtracking for unconstrained optimization, Implementing and modifying Broyden class updates for large scale optimization, An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem, On solving a special class of weakly nonlinear finite-difference systems, Some sufficient descent conjugate gradient methods and their global convergence, Bijective parameterization with free boundaries, The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices, A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition, Global convergence of conjugate gradient method, Dynamic search trajectory methods for global 
optimization, Composing Scalable Nonlinear Algebraic Solvers, A sufficient descent Liu–Storey conjugate gradient method and its global convergence, Sufficient descent Riemannian conjugate gradient methods, Some three-term conjugate gradient methods with the new direction structure, A class of globally convergent three-term Dai-Liao conjugate gradient methods, A sufficient descent conjugate gradient method and its global convergence, Computer Algebra and Line Search, Projection onto a Polyhedron that Exploits Sparsity, Simultaneous reconstruction of the perfusion coefficient and initial temperature from time-average integral temperature measurements, Scaled memoryless BFGS preconditioned steepest descent method for very large-scale unconstrained optimization, Optimization approach for the Monge-Ampère equation, A new accelerated conjugate gradient method for large-scale unconstrained optimization, A nonmonotone line search method and its convergence for unconstrained optimization, A hybrid-line-and-curve search globalization technique for inexact Newton methods, A note on a sufficient-decrease criterion for a non-derivative step-length procedure, Pseudospectral methods and iterative solvers for optimization problems from multiscale particle dynamics, A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition, Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination, A new non-linear conjugate gradient algorithm for destructive cure rate model and a simulation study: illustration with negative binomial competing risks, New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters, Variational methods for finding periodic orbits in the incompressible Navier–Stokes equations, Hybridization rule applied on accelerated double step size optimization scheme, Optimizing Oblique Projections for Nonlinear Systems using Trajectories, Recent advances in unconstrained optimization, 
Global convergence of the gradient method for functions definable in o-minimal structures, An Efficient and Robust Scalar Auxiliary Variable Based Algorithm for Discrete Gradient Systems Arising from Optimizations, Two diagonal conjugate gradient like methods for unconstrained optimization, Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs, Physically enhanced training for modeling rate-independent plasticity with feedforward neural networks, A modified four-term extension of the Dai-Liao conjugate gradient method, Optimization of unconstrained problems using a developed algorithm of spectral conjugate gradient method calculation, Direct energy minimization based on exponential transformation in density functional calculations of finite and extended systems, A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems, Shape optimization via a level set and a Gauss-Newton method, Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization, A novel iterative learning control scheme based on Broyden-class optimization method, Swarm-based optimization with random descent, Adaptive type-2 neural fuzzy sliding mode control of a class of nonlinear systems, A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems, Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in image processing, On the estimation of destructive cure rate model: A new study with exponentially weighted Poisson competing risks, A robust BFGS algorithm for unconstrained nonlinear optimization problems, Two methods for the implicit integration of stiff reaction systems, Robust regression against heavy heterogeneous contamination, Theoretical Foundation of the Stretch Energy Minimization for 
Area-Preserving Simplicial Mappings, Solving Unconstrained Optimization Problems with Some Three-term Conjugate Gradient Methods, Practical convergence conditions for unconstrained optimization, Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato, A New Algorithm To Solve Calculus Of Variations Problems Using Wolfe's Convergence Theory, Part 1: Theory and Algorithm, A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization, Global convergence of the BFGS algorithm with nonmonotone line search, An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method, A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice, Choice of a step-length in an almost everywhere differentiable (on every direction) (almost everywhere locally Lipschitz) lower-semi-continuous minimization problem, A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information, A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization, On Conjugate Gradient Algorithms as Objects of Scientific Study, Oblique projections, Broyden restricted class and limited-memory quasi-Newton methods, Globally convergent inexact generalized Newton method for first-order differentiable optimization problems, Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications, A new nonmonotone line search technique for unconstrained optimization, A NUMERICAL STUDY OF CONJUGATE GRADIENT DIRECTIONS FOR AN ULTRASOUND INVERSE PROBLEM, A descent family of Dai–Liao conjugate gradient methods, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Some descent three-term conjugate gradient methods and their 
global convergence, Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization, On the nonmonotone line search, An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization, Accelerated gradient descent methods with line search, New hybrid conjugate gradient method as a convex combination of LS and CD methods, On the convergence of sequential minimization algorithms, Analysis of the gradient method with an Armijo–Wolfe line search on a class of non-smooth convex functions, A spectral KRMI conjugate gradient method under the strong-Wolfe line search, Variational Image Regularization with Euler's Elastica Using a Discrete Gradient Scheme, Probabilistic Line Searches for Stochastic Optimization, Two hybrid nonlinear conjugate gradient methods based on a modified secant equation, A new two-parameter family of nonlinear conjugate gradient methods, Descent Property and Global Convergence of a New Search Direction Method for Unconstrained Optimization, On Matrix Nearness Problems: Distance to Delocalization, A NEW THREE–TERM CONJUGATE GRADIENT METHOD WITH DESCENT DIRECTION FOR UNCONSTRAINED OPTIMIZATION, A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice, New hybrid conjugate gradient method as a convex combination of HZ and CD methods, Comments on “New hybrid conjugate gradient method as a convex combination of FR and PRP methods”, A new hybrid conjugate gradient method of unconstrained optimization methods, A new spectral conjugate gradient method for large-scale unconstrained optimization, The higher-order Levenberg–Marquardt method with Armijo type line search for nonlinear equations, Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization, Convergence of quasi-Newton method with new inexact line search, From linear to nonlinear iterative methods, A robust 
descent type algorithm for geophysical inversion through adaptive regularization, An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property, Some modified conjugate gradient methods for unconstrained optimization, Spectral method and its application to the conjugate gradient method, An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition, Modifying the BFGS update by a new column scaling technique, Useful redundancy in parameter and time delay estimation for continuous-time models, On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients, New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems, A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization, New inexact line search method for unconstrained optimization, A constrained optimization approach to solving certain systems of convex equations, Global convergence properties of the two new dependent Fletcher-Reeves conjugate gradient methods, A note on the Morozov principle via Lagrange duality, Convergence of the Polak-Ribière-Polyak conjugate gradient method, A class of one parameter conjugate gradient methods, An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search, A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization, Global convergence properties of two modified BFGS-type methods, A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Shape optimization of continua using NURBS as basis functions, Shape optimization for a link 
mechanism, Modifications of the Wolfe line search rules to satisfy second-order optimality conditions in unconstrained optimization, Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization, A double parameter scaled BFGS method for unconstrained optimization, An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix, Convergence and stability of line search methods for unconstrained optimization, New cautious BFGS algorithm based on modified Armijo-type line search, A descent nonlinear conjugate gradient method for large-scale unconstrained optimization, A new class of nonlinear conjugate gradient coefficients with global convergence properties, Two modified scaled nonlinear conjugate gradient methods, Stepsize analysis for descent methods, On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae, A generalized direct search acceptable-point technique for use with descent-type multivariate algorithms, A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties, On the nonmonotonicity degree of nonmonotone line searches, An accelerated double step size model in unconstrained optimization, Smoothed \(\ell_1\)-regularization-based line search for sparse signal recovery, Global convergence of algorithms with nonmonotone line search strategy in unconstrained optimization, Some numerical experiments with variable-storage quasi-Newton algorithms, A tolerant algorithm for linearly constrained optimization calculations, A new algorithm for box-constrained global optimization, A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei, An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization, Accelerated double direction method for solving unconstrained optimization problems, A novel 
fractional Tikhonov regularization coupled with an improved super-memory gradient method and application to dynamic force identification problems, A sufficient descent LS conjugate gradient method for unconstrained optimization problems, The hybrid BFGS-CG method in solving unconstrained optimization problems, Convergence rate of descent method with new inexact line-search on Riemannian manifolds, A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches, New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction, Cubic regularization in symmetric rank-1 quasi-Newton methods, Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence, Convergence of PRP method with new nonmonotone line search, A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule, CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method, An adaptive scaled BFGS method for unconstrained optimization, Analysis of a self-scaling quasi-Newton method, Modified nonmonotone Armijo line search for descent method, A modified CG-DESCENT method for unconstrained optimization, A robust implementation of a sequential quadratic programming algorithm with successive error restoration, Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization, A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization, Solution of eigenvalue problems in Hilbert spaces by a gradient method, Convergence proof of minimization algorithms for nonconvex functions, Approximation methods for the unconstrained optimization, New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained 
optimization, Scaled memoryless symmetric rank one method for large-scale optimization, Fully implicit simulation of polymer flooding with MRST, On obtaining optimal well rates and placement for CO\(_2\) storage, Some three-term conjugate gradient methods with the inexact line search condition, Numerically stable computation of step-sizes for descent methods. The nonconvex case, On three-term conjugate gradient algorithms for unconstrained optimization, A new three-term conjugate gradient algorithm for unconstrained optimization, Effiziente Schrittweitenfunktionen bei unrestringierten Optimierungsaufgaben [Efficient step-size functions for unconstrained optimization problems], A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems, Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization, A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization, Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping, The revised DFP algorithm without exact line search, A comparison of nonlinear optimization methods for supervised learning in multilayer feedforward neural networks, A globalization procedure for solving nonlinear systems of equations, A new three-term conjugate gradient method, Acceleration of conjugate gradient algorithms for unconstrained optimization, Pseudorandom lattices for global optimization, A truncated descent HS conjugate gradient method and its global convergence, Convergence properties of the Beale-Powell restart algorithm, Convergence of implementable descent algorithms for unconstrained optimization, MERLIN-3.0: A multidimensional optimization environment, Optimization of upper semidifferentiable functions, An efficient line search for nonlinear least squares, On diagonally-preconditioning the 2-step BFGS method with accumulated steps for linearly constrained nonlinear programming, Direct search methods: Then and now, On diagonally preconditioning the truncated Newton method for super-scale linearly constrained nonlinear programming, Convergence of descent method with new line search, Variable metric random pursuit