New line search methods for unconstrained optimization
From MaRDI portal
Recommendations
- A new line search method with trust region for unconstrained optimization
- Descent property and global convergence of a new search direction method for unconstrained optimization
- A new nonmonotone line search technique for unconstrained optimization
- A new method for solving unconstrained optimization problems
- Convergence of descent method with new line search
Cites work
- scientific article; zbMATH DE number 3928227
- scientific article; zbMATH DE number 1998925
- scientific article; zbMATH DE number 3278849
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Family of Trust-Region-Based Algorithms for Unconstrained Minimization with Strong Global Convergence Properties
- A New Algorithm for Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A Trust Region Algorithm for Equality Constrained Minimization: Convergence Properties and Implementation
- A Trust Region Algorithm for Nonlinearly Constrained Optimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A modified BFGS method and its global convergence in nonconvex minimization
- A new line search method with trust region for unconstrained optimization
- A new matrix-free algorithm for the large-scale trust-region subproblem
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- A note on minimization problems and multistep methods
- Benchmarking optimization software with performance profiles
- CUTEr and SifDec
- Convergence Properties of the BFGS Algorithm
- Convergence of line search methods for unconstrained optimization
- Differential optimization techniques
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Introduction to global optimization
- Local convergence analysis for partitioned quasi-Newton updates
- Methods of conjugate gradients for solving linear systems
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- New quasi-Newton equation and related methods for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Nonmonotone adaptive trust-region method for unconstrained optimization problems
- Numerical Optimization
- One-step and multistep procedures for constrained minimization problems
- Quasi-Newton Methods, Motivation and Theory
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
- Semiparametric and Nonparametric Regression Analysis of Longitudinal Data
- Semiparametric log-linear regression for longitudinal measurements subject to outcome-dependent follow-up
- Testing Unconstrained Optimization Software
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The convergence properties of some new conjugate gradient methods
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
Cited in (29)
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search
- Efficient Line Search Methods for Convex Functions
- A conjugate gradient method with descent direction for unconstrained optimization
- A trust region algorithm with conjugate gradient technique for optimization problems
- Efficient line search algorithm for unconstrained optimization
- An active set limited memory BFGS algorithm for bound constrained optimization
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization
- A conjugate gradient method for unconstrained optimization problems
- A mixed spectral CD-DY conjugate gradient method
- Global convergence of a spectral conjugate gradient method for unconstrained optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- A New Method with Descent Property for Symmetric Nonlinear Equations
- New inexact line search method for unconstrained optimization
- An improved trust region method for unconstrained optimization
- A parallel dual matrix method for blind signal separation
- Global convergence of a modified Broyden family method for nonconvex functions
- scientific article; zbMATH DE number 1728407
- Principal direction search: A new method of search for unconstrained LP formulations
- Non monotone backtracking inexact BFGS method for regression analysis
- A quasi-Newton algorithm for large-scale nonlinear equations
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- A new line search method with trust region for unconstrained optimization
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- A new unidimensional search method for optimization: the 5/9 method
- A new class of nonlinear conjugate gradient coefficients with global convergence properties
- A BFGS algorithm for solving symmetric nonlinear equations
- A modified three-term PRP conjugate gradient algorithm for optimization models
This page was built for publication: New line search methods for unconstrained optimization (MaRDI item Q2510603)