On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae

DOI: 10.1007/s10957-015-0724-x
zbMath: 1327.90394
OpenAlex: W2042877555
MaRDI QID: Q887119

Saman Babaie-Kafaki

Publication date: 28 October 2015

Published in: Journal of Optimization Theory and Applications

Full work available at URL: https://doi.org/10.1007/s10957-015-0724-x
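For context, here is a minimal sketch (standard background, not reproduced from the paper itself) of the self-scaling memoryless BFGS update to which the title refers. With step $s_k = x_{k+1} - x_k$, gradient difference $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$, and scaling parameter $\theta_k > 0$, applying the BFGS update to the scaled identity $\theta_k I$ yields the inverse Hessian approximation

\[
H_{k+1} = \theta_k I - \theta_k \frac{s_k y_k^{\top} + y_k s_k^{\top}}{s_k^{\top} y_k} + \left(1 + \theta_k \frac{y_k^{\top} y_k}{s_k^{\top} y_k}\right) \frac{s_k s_k^{\top}}{s_k^{\top} y_k}.
\]

A commonly used choice is the Oren-Spedicato parameter $\theta_k = s_k^{\top} y_k / y_k^{\top} y_k$; the paper concerns optimal choices of such scaling parameters.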
Related Items (28)

A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ matrix norm
An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
A modified scaling parameter for the memoryless BFGS updating formula
Two accelerated nonmonotone adaptive trust region line search methods
A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
A hybrid quasi-Newton method with application in sparse recovery
A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
A modified scaled memoryless symmetric rank-one method
Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
A restart scheme for the memoryless BFGS method
An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions
A spectral conjugate gradient method for solving large-scale unconstrained optimization
An adaptive nonmonotone trust region algorithm
A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
A modified nonmonotone trust region line search method
A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
The new spectral conjugate gradient method for large-scale unconstrained optimisation
An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing

