New quasi-Newton equation and related methods for unconstrained optimization
From MaRDI portal
Publication: 1306664
DOI: 10.1023/A:1021898630001
zbMath: 0991.90135
OpenAlex: W186175500
MaRDI QID: Q1306664
Nai-Yang Deng, Jianzhong Zhang, Liang-Ho Chen
Publication date: 5 October 1999
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1023/a:1021898630001
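As context for the record, the "new quasi-Newton equation" of this paper is commonly cited in the form of a modified secant condition that injects function-value information into the update, B_{k+1} s_k = ỹ_k with ỹ_k = y_k + (θ_k / s_kᵀu_k) u_k and θ_k = 6(f_k − f_{k+1}) + 3(g_k + g_{k+1})ᵀ s_k, where u_k is any vector with s_kᵀu_k ≠ 0. The sketch below is an illustration of that commonly cited form (the function names and the toy objective are ours, not from the paper); it checks numerically that a BFGS-style update built from ỹ_k satisfies the modified equation.

```python
import numpy as np

# Sketch, assuming the commonly cited form of the modified secant
# condition from this paper:
#   B_{k+1} s_k = y~_k,   y~_k = y_k + (theta_k / (s_k^T u_k)) u_k,
#   theta_k = 6 (f_k - f_{k+1}) + 3 (g_k + g_{k+1})^T s_k,
# with u_k any vector satisfying s_k^T u_k != 0 (u_k = s_k below).

def modified_y(f_k, f_k1, g_k, g_k1, s, u=None):
    """Modified difference vector y~_k carrying function-value information."""
    if u is None:
        u = s
    y = g_k1 - g_k
    theta = 6.0 * (f_k - f_k1) + 3.0 * (g_k + g_k1) @ s
    return y + (theta / (s @ u)) * u

def bfgs_update(B, s, y):
    """Standard BFGS update of B; feeding y = y~_k enforces B_new @ s = y~_k."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Toy nonquadratic objective f(x) = x^T A x / 2 + 0.1 x_0^3 (our choice).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: 0.5 * x @ A @ x + 0.1 * x[0] ** 3
g = lambda x: A @ x + np.array([0.3 * x[0] ** 2, 0.0])

x_k, x_k1 = np.array([1.0, 1.0]), np.array([0.6, 0.7])
s = x_k1 - x_k
yt = modified_y(f(x_k), f(x_k1), g(x_k), g(x_k1), s)
B_new = bfgs_update(np.eye(2), s, yt)
assert np.allclose(B_new @ s, yt)  # modified secant equation holds
```

With u_k = s_k and exact arithmetic the identity B_new s = ỹ holds algebraically whenever the two denominators are nonzero; the assertion only confirms this numerically on the toy data.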
Related Items (first 100 shown)
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A modified scaling parameter for the memoryless BFGS updating formula
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Scaling damped limited-memory updates for unconstrained optimization
- Solving nonlinear monotone operator equations via modified SR1 update
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- A modified conjugate gradient method based on a modified secant equation
- A new adaptive Barzilai and Borwein method for unconstrained optimization
- Global convergence of a memory gradient method for unconstrained optimization
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A compact limited memory method for large scale unconstrained optimization
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
- A hybrid quasi-Newton method with application in sparse recovery
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- The global convergence of a modified BFGS method for nonconvex functions
- Scaling on the spectral gradient method
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A modified quasi-Newton method for nonlinear equations
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A modified conjugate gradient parameter via hybridization approach for solving large-scale systems of nonlinear equations
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- Structured two-point stepsize gradient methods for nonlinear least squares
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- A class of accelerated conjugate-gradient-like methods based on a modified secant equation
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- The projection technique for two open problems of unconstrained optimization problems
- A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning
- A hybrid BB-type method for solving large scale unconstrained optimization
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- A modified Newton-like method for nonlinear equations
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- Some modified Yabe-Takano conjugate gradient methods with sufficient descent condition
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- An active set limited memory BFGS algorithm for bound constrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A modified secant equation quasi-Newton method for unconstrained optimization
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A combined class of self-scaling and modified quasi-Newton methods
- A modified conjugacy condition and related nonlinear conjugate gradient method
- Quasi-Newton methods for multiobjective optimization problems
- An improved nonlinear conjugate gradient method with an optimal property
- A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- Higher order curvature information and its application in a modified diagonal secant method
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- On the local and superlinear convergence of a parameterized DFP method
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- A new modified BFGS method for unconstrained optimization problems
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Convergence analysis of an improved BFGS method and its application in the Muskingum model
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- A descent family of Dai-Liao conjugate gradient methods
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- New line search methods for unconstrained optimization
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A limited memory BFGS-type method for large-scale unconstrained optimization
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- A modified BFGS algorithm based on a hybrid secant equation
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A truncated descent HS conjugate gradient method and its global convergence
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Variable metric methods for unconstrained optimization and nonlinear least squares
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- A modified two-point stepsize gradient algorithm for unconstrained minimization
- A class of modified BFGS methods with function value information for unconstrained optimization
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- An adaptive sizing BFGS method for unconstrained optimization
- A modified Perry conjugate gradient method and its global convergence
Cites Work
- A Modified BFGS Algorithm for Unconstrained Optimization
- Conic Approximations and Collinear Scalings for Optimizers
- The Q-Superlinear Convergence of a Collinear Scaling Algorithm for Unconstrained Optimization
- The Newton and Cauchy Perspectives on Computational Nonlinear Optimization
- On the Local and Superlinear Convergence of Quasi-Newton Methods