Optimal conditioning of self-scaling variable metric algorithms

From MaRDI portal

Publication:4110808

DOI: 10.1007/BF01580654
zbMath: 0342.90045
OpenAlex: W2061054410
MaRDI QID: Q4110808

Emilio Spedicato, Shmuel S. Oren

Publication date: 1976

Published in: Mathematical Programming

Full work available at URL: https://doi.org/10.1007/bf01580654




Related Items (97)

A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ matrix norm
Numerical simulation of tridimensional electromagnetic shaping of liquid metals
Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
Computational experience with known variable metric updates
A descent hybrid conjugate gradient method based on the memoryless BFGS update
Family of optimally conditioned quasi-Newton updates for unconstrained optimization
A quasi-Newton method using a nonquadratic model
A modified scaling parameter for the memoryless BFGS updating formula
Numerical experiments with variations of the Gauss-Newton algorithm for nonlinear least squares
A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
A trust-region strategy for minimization on arbitrary domains
Scaling damped limited-memory updates for unconstrained optimization
Mechanical system modelling using recurrent neural networks via quasi-Newton learning methods
Maintaining the positive definiteness of the matrices in reduced secant methods for equality constrained optimization
Variationally derived algorithms in the ABS class for linear systems
The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
Two accelerated nonmonotone adaptive trust region line search methods
Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
Using function-values in multi-step quasi-Newton methods
Symmetric Perry conjugate gradient method
Two modified scaled nonlinear conjugate gradient methods
On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
A double parameter self-scaling memoryless BFGS method for unconstrained optimization
A projection-based derivative free DFP approach for solving system of nonlinear convex constrained monotone equations with image restoration applications
A modified scaled memoryless symmetric rank-one method
Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
Computational experience with conjugate gradient algorithms
An Optimal Broyden Updating Formula And Its Application To Nonlinear Least Squares
A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions
A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
Perspectives on self-scaling variable metric algorithms
Gaussian processes for history-matching: application to an unconventional gas reservoir
New Basic Hessian Approximations for Large-Scale Nonlinear Least-Squares Optimization
New combined method for unconstrained minimization
Some numerical experiments with variable-storage quasi-Newton algorithms
Quasi-Newton methods of rank-one type for solving unconstrained minimization problems. I: Methods and basic properties
Quasi-Newton methods of rank-one type for solving unconstrained minimization problems. II: n-step quadratic convergence for restart variants
Superlinear convergence of symmetric Huang's class of methods
A class of self-dual updating formulae in the Broyden family
Partitioned variable metric updates for large structured optimization problems
A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
A combined class of self-scaling and modified quasi-Newton methods
Computational experiments with scaled initial Hessian approximation for the Broyden family methods
Conjugate gradient methods using quasi-Newton updates with inexact line searches
Unnamed Item
An adaptive nonmonotone trust region algorithm
Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato
Convergence acceleration of direct trajectory optimization using novel Hessian calculation methods
Low rank updates in preconditioning the saddle point systems arising from data assimilation problems
A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
Cubic regularization in symmetric rank-1 quasi-Newton methods
Global convergence property of scaled two-step BFGS method
Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
Erratum to: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
A class of diagonal preconditioners for limited memory BFGS method
Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
A variable metric method for function minimization derived from invariancy to nonlinear scaling
A bound to the condition number of canonical rank-two corrections and applications to the variable metric methods
On the behaviour of a combined extra-updating/self-scaling BFGS method
Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Matrix conditioning and nonlinear optimization
An assessment of two approaches to variable metric methods
New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
Scaled memoryless symmetric rank one method for large-scale optimization
A modified nonmonotone trust region line search method
An adaptive family of projection methods for constrained monotone nonlinear equations with applications
An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point
Wide interval for efficient self-scaling quasi-Newton algorithms
On a conjecture of Dixon and other topics in variable metric methods
L-Broyden methods: a generalization of the L-BFGS method to the limited-memory Broyden family
A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update
Numerical experience with a class of self-scaling quasi-Newton algorithms
Optimal conditioning in the convex class of rank two updates
A new gradient method via quasi-Cauchy relation which guarantees descent
Computational experience with rank-one positive definite quasi-Newton algorithms
Unnamed Item
An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
QN-like variable storage conjugate gradients
Some investigations in a new algorithm for nonlinear optimization based on conic models of the objective function
Variable metric methods for unconstrained optimization and nonlinear least squares
On measure functions for the self-scaling updating formulae for quasi-Newton methods
On the conditioning of the Hessian approximation in quasi-Newton methods
Variationally derived scaling and variable metric updates from the preconvex part of the Broyden family
Sizing the BFGS and DFP updates: Numerical study
A fast and robust unconstrained optimization method requiring minimum storage



Cites Work


This page was built for publication: Optimal conditioning of self-scaling variable metric algorithms