Multi-step quasi-Newton methods for optimization
From MaRDI portal
Recommendations
- scientific article; zbMATH DE number 1552870
- New parameterization for multi-step quasi-Newton algorithms
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- scientific article; zbMATH DE number 11435
- Quasi-Newton methods for multiobjective optimization problems
- Quasi-Newton methods for solving multiobjective optimization
- scientific article; zbMATH DE number 1424421
- Quasi-Newton's method for multiobjective optimization
- New implicit multi-step quasi-Newton methods
- scientific article; zbMATH DE number 782037
Cites work
- scientific article; zbMATH DE number 4091191
- scientific article; zbMATH DE number 67160
- scientific article; zbMATH DE number 3474862
- scientific article; zbMATH DE number 1243473
- scientific article; zbMATH DE number 3436543
- A Class of Methods for Solving Nonlinear Simultaneous Equations
- A Note on Minimization Algorithms which make Use of Non-quadratic Properties of the Objective Function
- A Quadratically Convergent Krawczyk-Like Algorithm
- A Quasi-Newton Method Employing Direct Secant Updates of Matrix Factorizations
- A conjugate direction implementation of the BFGS algorithm with automatic scaling
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Correction to the Paper on Global Convergence of a Class of Trust Region Algorithms for Optimization with Simple Bounds
- Direct Secant Updates of Matrix Factorizations
- Inversionsfreie Verfahren zur Lösung nichtlinearer Operatorgleichungen [Inversion-free methods for solving nonlinear operator equations]
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function
- On Large Scale Nonlinear Least Squares Calculations
- On the construction of minimization methods of quasi-Newton type
- On the solution of highly structured nonlinear equations
- On the use of curvature estimates in quasi-Newton methods
- On the use of function-values in unconstrained optimisation
- Quasi-Newton Methods for Discretized Non-linear Boundary Problems
- Testing Unconstrained Optimization Software
Cited in (68)
- New gradient methods with adaptive stepsizes by approximate models
- scientific article; zbMATH DE number 5137319
- Modified multi-step quasi-Newton methods using function value information
- Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
- Quasi-Newton methods based on dispersed methods of recovery of the Hessian
- Variance matrix estimation in multivariate classical measurement error models
- New dual parametrization of quasi-Newton algorithms
- scientific article; zbMATH DE number 701588
- Approximating Higher-Order Derivative Tensors Using Secant Updates
- scientific article; zbMATH DE number 3894351
- Higher order curvature information and its application in a modified diagonal Secant method
- Multiple update multi-step methods for unconstrained optimization
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Optimization of Simulation via Quasi-Newton Methods
- On the construction of minimization methods of quasi-Newton type
- Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
- scientific article; zbMATH DE number 1552870
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- A modified secant equation quasi-Newton method for unconstrained optimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- A new two-step gradient-type method for large-scale unconstrained optimization
- Accelerated augmented Lagrangian method for total variation minimization
- Estimation of the variance matrix in bivariate classical measurement error models
- A nonlinear model for function-value multistep methods
- Extra multistep BFGS updates in quasi-Newton methods
- Multistep approximation algorithms: Improved convergence rates through postconditioning with smoothing kernels
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Adding variables to quasi-Newton Hessian approximations
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Quasi-Newton method: a new direction
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- A modified BFGS algorithm based on a hybrid secant equation
- A quasi-Newton acceleration for high-dimensional optimization algorithms
- Numerical experience with multiple update quasi-Newton methods for unconstrained optimization
- scientific article; zbMATH DE number 1424421
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
- A new super-memory gradient method with curve search rule
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- scientific article; zbMATH DE number 740991
- Two-step conjugate gradient method for unconstrained optimization
- A new modified Barzilai-Borwein gradient method for the quadratic minimization problem
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
- Globally convergent modified Perry's conjugate gradient method
- The use of alternation and recurrences in two-step quasi-Newton methods
- A hybrid quasi-Newton method with application in sparse recovery
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- Global convergence property of scaled two-step BFGS method
- Two new conjugate gradient methods based on modified secant equations
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- A descent family of Dai-Liao conjugate gradient methods
- A multi-iterate method to solve systems of nonlinear equations
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- A new class of memory gradient methods with inexact line searches
- New parameterization for multi-step quasi-Newton algorithms
- Implicit updates in multistep quasi-Newton methods
- Variable metric methods for unconstrained optimization and nonlinear least squares
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- An improved nonlinear conjugate gradient method with an optimal property
- Three-step fixed-point quasi-Newton methods for unconstrained optimisation
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- Minimum curvature multistep quasi-Newton methods
- Using function-values in multi-step quasi-Newton methods
This page was built for publication: Multi-step quasi-Newton methods for optimization (MaRDI item Q1334773)