New quasi-Newton equation and related methods for unconstrained optimization

From MaRDI portal
Publication:1306664

DOI: 10.1023/A:1021898630001
zbMath: 0991.90135
OpenAlex: W186175500
MaRDI QID: Q1306664

Nai-Yang Deng, Jianzhong Zhang, Liang-Ho Chen

Publication date: 5 October 1999

Published in: Journal of Optimization Theory and Applications

Full work available at URL: https://doi.org/10.1023/a:1021898630001




Related Items (showing first 100 items)

A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
A modified scaling parameter for the memoryless BFGS updating formula
A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
Scaling damped limited-memory updates for unconstrained optimization
Solving nonlinear monotone operator equations via modified SR1 update
New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
A modified conjugate gradient method based on a modified secant equation
A new adaptive Barzilai and Borwein method for unconstrained optimization
Global convergence of a memory gradient method for unconstrained optimization
Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
A compact limited memory method for large scale unconstrained optimization
A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
A hybrid quasi-Newton method with application in sparse recovery
A modified nonmonotone BFGS algorithm for unconstrained optimization
The global convergence of a modified BFGS method for nonconvex functions
Scaling on the spectral gradient method
A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
A modified quasi-Newton method for nonlinear equations
Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
Global convergence of a modified limited memory BFGS method for non-convex minimization
On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
A modified conjugate gradient parameter via hybridization approach for solving large-scale systems of nonlinear equations
A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
Structured two-point stepsize gradient methods for nonlinear least squares
An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
A class of accelerated conjugate-gradient-like methods based on a modified secant equation
Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
New DY-HS hybrid conjugate gradient algorithm for solving optimization problem of unsteady partial differential equations with convection term
A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
The projection technique for two open problems of unconstrained optimization problems
A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning
A hybrid BB-type method for solving large scale unconstrained optimization
Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
A new conjugate gradient algorithm for training neural networks based on a modified secant equation
A modified Newton-like method for nonlinear equations
A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
An active set limited memory BFGS algorithm for bound constrained optimization
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
A modified secant equation quasi-Newton method for unconstrained optimization
Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
A combined class of self-scaling and modified quasi-Newton methods
A modified conjugacy condition and related nonlinear conjugate gradient method
Quasi-Newton methods for multiobjective optimization problems
An improved nonlinear conjugate gradient method with an optimal property
A modified Hestense–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
Higher order curvature information and its application in a modified diagonal Secant method
Multi-step nonlinear conjugate gradient methods for unconstrained minimization
On the Local and Superlinear Convergence of a Parameterized DFP Method
An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
A new modified BFGS method for unconstrained optimization problems
Another hybrid conjugate gradient algorithm for unconstrained optimization
Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
Convergence analysis of an improved BFGS method and its application in the Muskingum model
Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
Two new conjugate gradient methods based on modified secant equations
A descent family of Dai–Liao conjugate gradient methods
Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Notes on the Dai-Yuan-Yuan modified spectral gradient method
Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
New line search methods for unconstrained optimization
An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
A limited memory BFGS-type method for large-scale unconstrained optimization
A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
A modified BFGS algorithm based on a hybrid secant equation
Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
A truncated descent HS conjugate gradient method and its global convergence
Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
A nonlinear conjugate gradient method based on the MBFGS secant condition
A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
Variable metric methods for unconstrained optimization and nonlinear least squares
A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
A modified two-point stepsize gradient algorithm for unconstrained minimization
A class of modified BFGS methods with function value information for unconstrained optimization
A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
An adaptive sizing BFGS method for unconstrained optimization
A modified Perry conjugate gradient method and its global convergence


Uses Software


Cites Work


This page was built for publication: New quasi-Newton equation and related methods for unconstrained optimization