Convergence Conditions for Ascent Methods
Publication: 5566849
DOI: 10.1137/1011036 · zbMath: 0177.20603 · OpenAlex: W1988849934 · Wikidata: Q56560305 · Scholia: Q56560305 · MaRDI QID: Q5566849
Publication date: 1969
Published in: SIAM Review
Full work available at URL: https://doi.org/10.1137/1011036
Related Items (only showing first 100 items)
Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination ⋮ A new non-linear conjugate gradient algorithm for destructive cure rate model and a simulation study: illustration with negative binomial competing risks ⋮ New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters ⋮ Variational methods for finding periodic orbits in the incompressible Navier–Stokes equations ⋮ Hybridization rule applied on accelerated double step size optimization scheme ⋮ Optimizing Oblique Projections for Nonlinear Systems using Trajectories ⋮ Recent advances in unconstrained optimization ⋮ Unnamed Item ⋮ Global convergence of the gradient method for functions definable in o-minimal structures ⋮ An Efficient and Robust Scalar Auxiliary Variable Based Algorithm for Discrete Gradient Systems Arising from Optimizations ⋮ Two diagonal conjugate gradient like methods for unconstrained optimization ⋮ Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs ⋮ Physically enhanced training for modeling rate-independent plasticity with feedforward neural networks ⋮ A modified four-term extension of the Dai-Liao conjugate gradient method ⋮ Optimization of unconstrained problems using a developed algorithm of spectral conjugate gradient method calculation ⋮ Direct energy minimization based on exponential transformation in density functional calculations of finite and extended systems ⋮ A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems ⋮ Shape optimization via a level set and a Gauss-Newton method ⋮ Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization ⋮ A novel iterative learning control scheme based on Broyden-class optimization method ⋮ Swarm-based optimization with random descent ⋮ Adaptive type-2 neural fuzzy sliding mode control of a class of
nonlinear systems ⋮ A new hybrid conjugate gradient algorithm based on the Newton direction to solve unconstrained optimization problems ⋮ Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in image processing ⋮ On the estimation of destructive cure rate model: A new study with exponentially weighted Poisson competing risks ⋮ A robust BFGS algorithm for unconstrained nonlinear optimization problems ⋮ Two methods for the implicit integration of stiff reaction systems ⋮ Robust regression against heavy heterogeneous contamination ⋮ Theoretical Foundation of the Stretch Energy Minimization for Area-Preserving Simplicial Mappings ⋮ Solving Unconstrained Optimization Problems with Some Three-term Conjugate Gradient Methods ⋮ Practical convergence conditions for unconstrained optimization ⋮ Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato ⋮ A New Algorithm To Solve Calculus Of Variations Problems Using Wolfe's Convergence Theory, Part 1: Theory And Algorithm ⋮ A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization ⋮ Global convergence of the BFGS algorithm with nonmonotone line search ⋮ An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method ⋮ A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice ⋮ Choice of a step-length in an almost everywhere differentiable (on every direction) (almost everywhere locally Lipschitz) lower-semi-continuous minimization problem ⋮ A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information ⋮ A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization ⋮ On Conjugate Gradient Algorithms as Objects of Scientific Study ⋮ Oblique projections, Broyden
restricted class and limited-memory quasi-Newton methods ⋮ Globally convergent inexact generalized Newton method for first-order differentiable optimization problems ⋮ Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications ⋮ Unnamed Item ⋮ A new nonmonotone line search technique for unconstrained optimization ⋮ A NUMERICAL STUDY OF CONJUGATE GRADIENT DIRECTIONS FOR AN ULTRASOUND INVERSE PROBLEM ⋮ A descent family of Dai–Liao conjugate gradient methods ⋮ Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization ⋮ Some descent three-term conjugate gradient methods and their global convergence ⋮ On quasi-Newton methods in fast Fourier transform-based micromechanics ⋮ Inverse problem for determining free parameters of a reduced turbulent transport model for tokamak plasma ⋮ Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization ⋮ On the nonmonotone line search ⋮ An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization ⋮ Accelerated gradient descent methods with line search ⋮ New hybrid conjugate gradient method as a convex combination of LS and CD methods ⋮ On the convergence of sequential minimization algorithms ⋮ Analysis of the gradient method with an Armijo–Wolfe line search on a class of non-smooth convex functions ⋮ A spectral KRMI conjugate gradient method under the strong-Wolfe line search ⋮ Variational Image Regularization with Euler's Elastica Using a Discrete Gradient Scheme ⋮ Probabilistic Line Searches for Stochastic Optimization ⋮ Unnamed Item ⋮ Two hybrid nonlinear conjugate gradient methods based on a modified secant equation ⋮ A new two-parameter family of nonlinear conjugate gradient methods ⋮ Descent Property and Global Convergence of a New Search Direction Method for Unconstrained Optimization ⋮ On Matrix Nearness Problems: Distance to Delocalization ⋮ A NEW
THREE–TERM CONJUGATE GRADIENT METHOD WITH DESCENT DIRECTION FOR UNCONSTRAINED OPTIMIZATION ⋮ A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice ⋮ New hybrid conjugate gradient method as a convex combination of HZ and CD methods ⋮ Comments on "New hybrid conjugate gradient method as a convex combination of FR and PRP methods" ⋮ A new hybrid conjugate gradient method of unconstrained optimization methods ⋮ A new spectral conjugate gradient method for large-scale unconstrained optimization ⋮ The higher-order Levenberg–Marquardt method with Armijo type line search for nonlinear equations ⋮ Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization ⋮ Convergence of quasi-Newton method with new inexact line search ⋮ From linear to nonlinear iterative methods ⋮ A robust descent type algorithm for geophysical inversion through adaptive regularization ⋮ An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property ⋮ Some modified conjugate gradient methods for unconstrained optimization ⋮ Spectral method and its application to the conjugate gradient method ⋮ An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition ⋮ Modifying the BFGS update by a new column scaling technique ⋮ Useful redundancy in parameter and time delay estimation for continuous-time models ⋮ On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients ⋮ New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems ⋮ A modified Wei-Yao-Liu conjugate gradient method for unconstrained optimization ⋮ New inexact line search method for unconstrained optimization ⋮ A constrained optimization approach to solving certain systems of convex equations ⋮ Global convergence properties of the two new dependent Fletcher-Reeves conjugate
gradient methods ⋮ A note on the Morozov principle via Lagrange duality ⋮ Convergence of the Polak-Ribière-Polyak conjugate gradient method ⋮ A class of one parameter conjugate gradient methods ⋮ An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search ⋮ A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization ⋮ Global convergence properties of two modified BFGS-type methods ⋮ A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization ⋮ New conjugacy condition and related new conjugate gradient methods for unconstrained optimization ⋮ A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization ⋮ Shape optimization of continua using NURBS as basis functions