A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization

Publication: 2346397

DOI: 10.1007/s10957-014-0528-4
zbMath: 1319.49042
OpenAlex: W2031891766
MaRDI QID: Q2346397

Yu-Hong Dai, CaiXia Kou

Publication date: 1 June 2015

Published in: Journal of Optimization Theory and Applications

Full work available at URL: https://doi.org/10.1007/s10957-014-0528-4




Related Items

A descent hybrid conjugate gradient method based on the memoryless BFGS update
A modified scaling parameter for the memoryless BFGS updating formula
An efficient hybrid conjugate gradient method with sufficient descent property for unconstrained optimization
A new family of hybrid three-term conjugate gradient methods with applications in image restoration
Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising
A family of hybrid conjugate gradient method with restart procedure for unconstrained optimizations and image restorations
An inertial spectral CG projection method based on the memoryless BFGS update
Two families of hybrid conjugate gradient methods with restart procedures and their applications
Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems
Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
Unnamed Item
A spectral three-term Hestenes-Stiefel conjugate gradient method
An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
A new conjugate gradient method with an efficient memory structure
Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
A three-term conjugate gradient method with accelerated subspace quadratic optimization
Two classes of spectral conjugate gradient methods for unconstrained optimizations
A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
Theoretical characteristics and numerical methods for a class of special piecewise quadratic optimization


Uses Software


Cites Work