Convergence Conditions for Ascent Methods. II: Some Corrections
From MaRDI portal
Publication: 5619730
DOI: 10.1137/1013035
zbMath: 0216.26901
OpenAlex: W2012960907
Wikidata: Q56560320 (Scholia: Q56560320)
MaRDI QID: Q5619730
Publication date: 1971
Published in: SIAM Review
Full work available at URL: https://doi.org/10.1137/1013035
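This note is Philip Wolfe's sequel correcting results in "Convergence Conditions for Ascent Methods" (SIAM Review, 1969), the paper that introduced the line-search conditions now known as the Wolfe conditions. As context for the related items that cite it, here is a minimal sketch of checking those conditions in the one-dimensional case; the function and parameter names are illustrative, not taken from the paper:

```python
def satisfies_wolfe(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe conditions for step length ``alpha``
    along a descent direction ``p`` at ``x`` (1-D case for brevity).

    Sufficient decrease (Armijo): f(x + a*p) <= f(x) + c1 * a * grad(x) * p
    Curvature:                    grad(x + a*p) * p >= c2 * grad(x) * p
    with 0 < c1 < c2 < 1.
    """
    slope = grad(x) * p  # directional derivative; negative for a descent direction
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * slope
    curvature = grad(x + alpha * p) * p >= c2 * slope
    return armijo and curvature


# Illustrative example: minimize f(x) = x^2 from x = 1 along p = -1.
f = lambda x: x * x
g = lambda x: 2.0 * x
print(satisfies_wolfe(f, g, 1.0, -1.0, 1.0))   # full step: both conditions hold
print(satisfies_wolfe(f, g, 1.0, -1.0, 1e-6))  # tiny step: curvature condition rejects it
```

The curvature condition is what distinguishes the Wolfe rules from a plain Armijo backtracking test: it rules out steps so small that the iteration stalls, which is central to the convergence arguments the paper corrects.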
Related Items
Convergence of quasi-Newton method with new inexact line search
From linear to nonlinear iterative methods
A robust descent type algorithm for geophysical inversion through adaptive regularization
An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
Gaussian process regression for maximum entropy distribution
Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization
Some modified conjugate gradient methods for unconstrained optimization
Spectral method and its application to the conjugate gradient method
Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property
A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
STUDYING THE BASIN OF CONVERGENCE OF METHODS FOR COMPUTING PERIODIC ORBITS
Optimal control of bioprocess systems using hybrid numerical optimization algorithms
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
New inexact line search method for unconstrained optimization
A constrained optimization approach to solving certain systems of convex equations
Global convergence properties of the two new dependent Fletcher-Reeves conjugate gradient methods
Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
A subclass of generating set search with convergence to second-order stationary points
Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search
A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization
A recalling-enhanced recurrent neural network: conjugate gradient learning algorithm and its convergence analysis
Constrained optimal control of switched systems based on modified BFGS algorithm and filled function method
A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
Stochastic quasi-Newton with line-search regularisation
A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Constrained neural network training and its application to hyperelastic material modeling
Modifications of the Wolfe line search rules to satisfy second-order optimality conditions in unconstrained optimization
A double parameter scaled BFGS method for unconstrained optimization
An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
Convergence and stability of line search methods for unconstrained optimization
A decent three term conjugate gradient method with global convergence properties for large scale unconstrained optimization problems
Two modified scaled nonlinear conjugate gradient methods
A modified conjugate gradient method based on the self-scaling memoryless BFGS update
Diagonal approximation of the Hessian by finite differences for unconstrained optimization
A diagonal quasi-Newton updating method for unconstrained optimization
A link between the steepest descent method and fixed-point iterations
New conjugate gradient method for unconstrained optimization
New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
A double parameter self-scaling memoryless BFGS method for unconstrained optimization
A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization
A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties
Assimilating data on the location of the free surface of a fluid flow to determine its viscosity
Smoothed \(\ell_1\)-regularization-based line search for sparse signal recovery
On the sufficient descent property of the Shanno's conjugate gradient method
Convergence of conjugate gradient methods with a closed-form stepsize formula
A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
Semi-discrete optimal transport: a solution procedure for the unsquared Euclidean distance case
A sufficient descent LS conjugate gradient method for unconstrained optimization problems
The hybrid BFGS-CG method in solving unconstrained optimization problems
A class of gradient unconstrained minimization algorithms with adaptive stepsize
The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
Cubic regularization in symmetric rank-1 quasi-Newton methods
An adaptive scaled BFGS method for unconstrained optimization
Scaled conjugate gradient algorithms for unconstrained optimization
Analysis of a self-scaling quasi-Newton method
Self-adaptive inexact proximal point methods
Modified nonmonotone Armijo line search for descent method
Improved sign-based learning algorithm derived by the composite nonlinear Jacobi process
Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Convergence of nonmonotone line search method
Convergence proof of minimization algorithms for nonconvex functions
Hybrid Riemannian conjugate gradient methods with global convergence properties
New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
On obtaining optimal well rates and placement for CO\(_2\) storage
On three-term conjugate gradient algorithms for unconstrained optimization
A new three-term conjugate gradient algorithm for unconstrained optimization
An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem
Some sufficient descent conjugate gradient methods and their global convergence
Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization
A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
Spectral three-term constrained conjugate gradient algorithm for function minimizations
Dynamic search trajectory methods for global optimization
A sufficient descent Liu–Storey conjugate gradient method and its global convergence
Sufficient descent Riemannian conjugate gradient methods
A class of globally convergent three-term Dai-Liao conjugate gradient methods
A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping
The revised DFP algorithm without exact line search
Projection onto a Polyhedron that Exploits Sparsity
Simultaneous reconstruction of the perfusion coefficient and initial temperature from time-average integral temperature measurements
A new accelerated conjugate gradient method for large-scale unconstrained optimization
A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
A hybrid-line-and-curve search globalization technique for inexact Newton methods
A truncated descent HS conjugate gradient method and its global convergence
Convergence properties of the Beale-Powell restart algorithm
A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
A modified bat algorithm with conjugate gradient method for global optimization
On convergence of minimization methods: Attraction, repulsion, and selection
A new hybrid algorithm for convex nonlinear unconstrained optimization
Adaptive machine learning-based surrogate modeling to accelerate PDE-constrained optimization in enhanced oil recovery
Pseudospectral methods and iterative solvers for optimization problems from multiscale particle dynamics
A CLASS OF DFP ALGORITHMS WITH REVISED SEARCH DIRECTION
Convergence of descent method with new line search
Variable metric random pursuit
A NOTE ON THE CONVERGENCE OF THE DFP ALGORITHM ON QUADRATIC UNIFORMLY CONVEX FUNCTIONS
A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
An optimal control framework for dynamic induction control of wind farms and their interaction with the atmospheric boundary layer
Assimilation of boundary data for reconstructing the absorption coefficient in a model of stationary reaction-convection-diffusion
Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs
Physically enhanced training for modeling rate-independent plasticity with feedforward neural networks
Direct energy minimization based on exponential transformation in density functional calculations of finite and extended systems
A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems
A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems
Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in image processing
Stochastic global optimization methods part I: Clustering methods
A robust BFGS algorithm for unconstrained nonlinear optimization problems
Solving Unconstrained Optimization Problems with Some Three-term Conjugate Gradient Methods
An efficient new hybrid CG-method as convex combination of DY and CD and HS algorithms
A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato
A New Algorithm To Solve Calculus Of Variations Problems Using Wolfe's Convergence Theory, Part 1: Theory And Algorithm
A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization
An extension of Curry's theorem to steepest descent in normed linear spaces
Accurate Manycore-Accelerated Manifold Surface Remesh Kernels
A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
Choice of a step-length in an almost everywhere differentiable (on every direction) (almost everywhere locally Lipschitz) lower-semi-continuous minimization problem
A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information
A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
Oblique projections, Broyden restricted class and limited-memory quasi-Newton methods
Globally convergent inexact generalized Newton method for first-order differentiable optimization problems
A descent family of Dai–Liao conjugate gradient methods
Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
ADAPTIVE ALGORITHMS FOR NEURAL NETWORK SUPERVISED LEARNING: A DETERMINISTIC OPTIMIZATION APPROACH
An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization
On solving a special class of weakly nonlinear finite-difference systems
On the use of directions of negative curvature in a modified Newton method
A modification of classical conjugate gradient method using strong Wolfe line search
Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
On Matrix Nearness Problems: Distance to Delocalization
A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
A new spectral conjugate gradient method for large-scale unconstrained optimization
Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization