Scaling damped limited-memory updates for unconstrained optimization
DOI: 10.1007/s10957-016-0940-z
zbMATH Open: 1346.90582
OpenAlex: W2343706742
MaRDI QID: Q306311
Authors: Fahimeh Biglari, Farideh Mahmoodpur
Publication date: 31 August 2016
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-016-0940-z
Recommendations
- Updating the self-scaling symmetric rank one algorithm with limited memory for large-scale unconstrained optimization
- Modifications of the limited memory BFGS algorithm for large-scale nonlinear optimization
- Dynamic scaling on the limited memory BFGS method
- Damped techniques for the limited memory BFGS method for large-scale optimization
Keywords: nonlinear programming; large-scale optimization; damped technique; limited-memory quasi-Newton methods; scaling technique
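The keywords name the standard ingredients of such methods. As a minimal, hypothetical sketch (not the authors' algorithm), the following Python combines Powell-style damping of the curvature pairs with an Oren-Luenberger-type scaling of the initial matrix inside the usual L-BFGS two-loop recursion; the damping constant c = 0.2 and the scalar approximation of B are assumptions made here for brevity.

```python
import numpy as np

def damped_pair(s, y, scale=1.0, c=0.2):
    # Powell-style damping: replace y by r = theta*y + (1 - theta)*B*s
    # so that s^T r = c * s^T B s > 0 whenever s^T y is too small,
    # keeping the quasi-Newton update well defined on nonconvex problems.
    # B is approximated by scale * I -- an assumption for this sketch.
    Bs = scale * s
    sBs = s @ Bs
    sy = s @ y
    theta = 1.0 if sy >= c * sBs else (1.0 - c) * sBs / (sBs - sy)
    return s, theta * y + (1.0 - theta) * Bs

def lbfgs_direction(grad, S, R):
    # Two-loop recursion over the m stored damped pairs (s_i, r_i),
    # assuming at least one pair is stored. The initial matrix is
    # gamma * I with the Oren-Luenberger-type scaling
    # gamma = s^T r / r^T r taken from the newest pair.
    q = grad.astype(float).copy()
    rhos = [1.0 / (s @ r) for s, r in zip(S, R)]
    alphas = []
    for s, r, rho in zip(reversed(S), reversed(R), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * r
    gamma = (S[-1] @ R[-1]) / (R[-1] @ R[-1])
    q *= gamma
    for (s, r, rho), a in zip(zip(S, R, rhos), reversed(alphas)):
        b = rho * (r @ q)
        q += (a - b) * s
    return -q  # descent direction -H_k * grad
```

A line-search driver would form (s_k, y_k) = (x_{k+1} - x_k, grad_{k+1} - grad_k) at each iteration, push the damped version from damped_pair into S and R (keeping only the last m pairs), and step along lbfgs_direction.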
Cites Work
- Title not available
- Benchmarking optimization software with performance profiles
- On the limited memory BFGS method for large scale optimization
- Conditioning of Quasi-Newton Methods for Function Minimization
- Line search algorithms with guaranteed sufficient decrease
- An unconstrained optimization test functions collection
- Some numerical experiments with variable-storage quasi-Newton algorithms
- Accuracy and Stability of Numerical Algorithms
- Sizing and Least-Change Secant Methods
- Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization
- Matrix conditioning and nonlinear optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- New quasi-Newton equation and related methods for unconstrained optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Condition numbers and equilibration of matrices
- Error Analysis of Direct Methods of Matrix Inversion
- A survey of quasi-Newton equations and quasi-Newton methods for optimization
- A Modified BFGS Algorithm for Unconstrained Optimization
- A Comparative Study of Algorithms for Matrix Balancing
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions
- Damped techniques for the limited memory BFGS method for large-scale optimization
- Limited memory BFGS method based on a high-order tensor model
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Computational experiments with scaled initial Hessian approximation for the Broyden family methods
- Dynamic scaling on the limited memory BFGS method
- On practical modifications of the quasi-Newton BFGS method
- A new class of quasi-Newton updating formulas
- On the selection of parameters in Self Scaling Variable Metric Algorithms
- Algorithms for nonlinear constraints that use Lagrangian functions
- A combined class of self-scaling and modified quasi-Newton methods
- The Quasi-Cauchy Relation and Diagonal Updating
- Equilibration of Symmetric Matrices in the Max-Norm
- Limited-memory BFGS diagonal preconditioners for a data assimilation problem in meteorology
Cited In (3)