A modified BFGS method and its global convergence in nonconvex minimization

From MaRDI portal
Publication:5936068

DOI: 10.1016/S0377-0427(00)00540-9
zbMath: 0984.65055
MaRDI QID: Q5936068

Masao Fukushima, Dong-hui Li

Publication date: 12 May 2002

Published in: Journal of Computational and Applied Mathematics

Related Items

Multicategory Angle-Based Learning for Estimating Optimal Dynamic Treatment Regimes With Censored Data, IDENTIFICATION OF TEMPERATURE-DEPENDENT PARAMETERS IN LASER-INTERSTITIAL THERMO THERAPY, A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update, An improved diagonal Jacobian approximation via a new quasi-Cauchy condition for solving large-scale systems of nonlinear equations, Quasi-Newton methods for machine learning: forget the past, just sample, A Riemannian BFGS Method for Nonconvex Optimization Problems, New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters, A modified conjugate gradient method based on a modified secant equation, A modified nonmonotone BFGS algorithm for unconstrained optimization, A Modified Non-Monotone BFGS Method for Non-Convex Unconstrained Optimization, The global convergence of a modified BFGS method for nonconvex functions, A new descent method for symmetric non-monotone variational inequalities with application to eigenvalue complementarity problems, Phase field modeling of brittle fracture in large-deformation solid shells with the efficient quasi-Newton solution and global-local approach, An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems, A modified conjugate gradient parameter via hybridization approach for solving large-scale systems of nonlinear equations, An efficient modified residual-based algorithm for large scale symmetric nonlinear equations by approximating successive iterated gradients, A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems, An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization, Regularization of limited memory 
quasi-Newton methods for large-scale nonconvex minimization, A statistical multivariable optimization method using improved orthogonal algorithm based on large data, A derivative-free line search technique for Broyden-like method with applications to NCP, wLCP and SI, On Stochastic and Deterministic Quasi-Newton Methods for Nonstrongly Convex Optimization: Asymptotic Convergence and Rate Analysis, An adaptive modified three-term conjugate gradient method with global convergence, A hybrid BB-type method for solving large scale unconstrained optimization, A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems, Inexact proximal DC Newton-type method for nonconvex composite functions, A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems, Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization, A restart scheme for the memoryless BFGS method, Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization, An adaptive projection BFGS method for nonconvex unconstrained optimization problems, A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula, A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems, Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition, A modified secant equation quasi-Newton method for unconstrained optimization, A gradient projection method for the sparse signal reconstruction in compressive sensing, A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function, A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method, Block BFGS Methods, A Modified PRP Conjugate Gradient Algorithm with 
Trust Region for Optimization Problems, On the Local and Superlinear Convergence of a Parameterized DFP Method, New BFGS method for unconstrained optimization problem based on modified Armijo line search, A descent family of Dai–Liao conjugate gradient methods, Nonsmooth equation based BFGS method for solving KKT systems in mathematical programming, New quasi-Newton methods for unconstrained optimization problems, Some descent three-term conjugate gradient methods and their global convergence, A globally convergent BFGS method for nonlinear monotone equations without any merit functions, New line search methods for unconstrained optimization, Some sufficient descent conjugate gradient methods and their global convergence, Inexact Newton and quasi-Newton methods for the output feedback pole assignment problem, The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices, A globally convergent BFGS method for nonconvex minimization without line searches, A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization, A robust multi-batch L-BFGS method for machine learning, Two hybrid nonlinear conjugate gradient methods based on a modified secant equation, An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems, A nonlinear conjugate gradient method based on the MBFGS secant condition, A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization, Nonsmoothness and a variable metric method, A modified two-point stepsize gradient algorithm for unconstrained minimization, An efficient implementation of a trust region method for box constrained optimization, A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter, A CLASS OF MODIFIED BFGS METHODS WITH FUNCTION VALUE INFORMATION FOR UNCONSTRAINED OPTIMIZATION, A nonmonotone scaled 
conjugate gradient algorithm for large-scale unconstrained optimization, Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization, A modified Perry conjugate gradient method and its global convergence, A limited memory quasi-Newton trust-region method for box constrained optimization, On Hager and Zhang's conjugate gradient method with guaranteed descent, A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem, An efficient gradient-free projection algorithm for constrained nonlinear equations and image restoration, An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition, A modified scaling parameter for the memoryless BFGS updating formula, Scaling damped limited-memory updates for unconstrained optimization, A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems, Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing, A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems, Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model, A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration, A hybrid quasi-Newton method with application in sparse recovery, Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization, Global convergence of a modified Broyden family method for nonconvex functions, A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, A double parameter scaled BFGS method for unconstrained optimization, An adaptive three-term conjugate 
gradient method based on self-scaling memoryless BFGS matrix, Global convergence of a modified limited memory BFGS method for non-convex minimization, Two modified HS type conjugate gradient methods for unconstrained optimization problems, On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae, A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations, A class of accelerated conjugate-gradient-like methods based on a modified secant equation, Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions, A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods, Nonmonotone spectral method for large-scale symmetric nonlinear equations, A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems, The projection technique for two open problems of unconstrained optimization problems, The convergence of a new modified BFGS method without line searches for unconstrained optimization or complexity systems, Improved Hessian approximation with modified secant equations for symmetric rank-one method, Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, An active set limited memory BFGS algorithm for bound constrained optimization, Globally convergent modified Perry's conjugate gradient method, New nonsmooth equations-based algorithms for \(\ell_1\)-norm minimization and applications, Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization, Two effective hybrid conjugate gradient algorithms based on modified BFGS updates, A combined class of self-scaling and modified quasi-Newton methods, A limited-memory optimization method using the infinitely many times 
repeated BNS update and conjugate directions, Two modified three-term type conjugate gradient methods and their global convergence for unconstrained optimization, A survey of gradient methods for solving nonlinear optimization, The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions, An improved nonlinear conjugate gradient method with an optimal property, Two modified three-term conjugate gradient methods with sufficient descent property, Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, A partitioned PSB method for partially separable unconstrained optimization problems, A nonmonotone filter line search technique for the MBFGS method in unconstrained optimization, Design and analysis of two discrete-time ZD algorithms for time-varying nonlinear minimization, Using gradient directions to get global convergence of Newton-type methods, Global convergence of the nonmonotone MBFGS method for nonconvex unconstrained minimization, An adaptive scaled BFGS method for unconstrained optimization, A new modified BFGS method for unconstrained optimization problems, An improved adaptive trust-region algorithm, A regularized limited memory BFGS method for nonconvex unconstrained minimization, Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems, A new hybrid PRPFR conjugate gradient method for solving nonlinear monotone equations and image restoration problems, A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing, Convergence analysis of an improved BFGS method and its application in the Muskingum model, Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization, Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length, A conjugate gradient algorithm 
for large-scale nonlinear equations and image restoration problems, Two new conjugate gradient methods based on modified secant equations, Some nonlinear conjugate gradient methods based on spectral scaling secant equations, A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization, A Newton-like trust region method for large-scale unconstrained nonconvex minimization, Convergence analysis of a modified BFGS method on convex minimizations, A brief survey of methods for solving nonlinear least-squares problems, Three-dimensional phase-field modeling of mode I + II/III failure in solids, Fracture of thermo-elastic solids: phase-field modeling and new results with an efficient monolithic solver, A variation of Broyden class methods using Householder adaptive transforms, Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems, Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions, A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique, An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing, Semi-parametric estimation of multivariate extreme expectiles, Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search, A spectral three-term Hestenes-Stiefel conjugate gradient method, A limited memory BFGS-type method for large-scale unconstrained optimization, An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation, New investigation for the Liu-Story scaled conjugate gradient method for nonlinear optimization, Two modified Dai-Yuan nonlinear conjugate gradient methods, A conjugate gradient method with sufficient descent property, A limited memory \(q\)-BFGS algorithm for unconstrained optimization problems, Two limited-memory optimization methods 
with minimum violation of the previous secant conditions, A generalised phase field model for fatigue crack growth in elastic-plastic solids with an efficient monolithic solver, Adaptive scaling damped BFGS method without gradient Lipschitz continuity, A globally convergent BFGS method with nonmonotone line search for non-convex minimization, A modified BFGS algorithm based on a hybrid secant equation, An improved nonmonotone adaptive trust region method., A new accelerated conjugate gradient method for large-scale unconstrained optimization, Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications, A new backtracking inexact BFGS method for symmetric nonlinear equations, A modified spectral conjugate gradient method with global convergence, A modified PRP-type conjugate gradient projection algorithm for solving large-scale monotone nonlinear equations with convex constraint, A globally convergent BFGS method for symmetric nonlinear equations, A new type of quasi-Newton updating formulas based on the new quasi-Newton equation, A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function, Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing, A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems, Fast multivariate log-concave density estimation, Diagonal BFGS updates and applications to the limited memory BFGS method


Uses Software


Cites Work