The projection technique for two open problems of unconstrained optimization problems
From MaRDI portal
Publication:2194129
Recommendations
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- The global convergence of a modified BFGS method for nonconvex functions
- A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
- scientific article; zbMATH DE number 1159286
Cites work
- scientific article; zbMATH DE number 3843083
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 88930
- scientific article; zbMATH DE number 3529352
- scientific article; zbMATH DE number 1243473
- scientific article; zbMATH DE number 1369459
- scientific article; zbMATH DE number 778130
- scientific article; zbMATH DE number 3278849
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A New Algorithm for Unconstrained Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Rapidly Convergent Descent Method for Minimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A class of derivative-free methods for large-scale nonlinear monotone equations
- A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm
- A conjugate gradient method with descent direction for unconstrained optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A hybrid MBFGS and CBFGS method for nonconvex minimization with a global complexity bound
- A modified BFGS method and its global convergence in nonconvex minimization
- A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A modified PRP conjugate gradient method
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A new approach to variable metric algorithms
- A perfect example for the BFGS method
- A projection method for a system of nonlinear monotone equations with convex constraints
- A projection method for convex constrained monotone nonlinear equations with applications
- A short note on the global convergence of the unmodified PRP method
- A three-term derivative-free projection method for nonlinear monotone system of equations
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Algorithm 851
- An SQP-type method and its application in stochastic programs
- An unconstrained optimization test functions collection
- Benchmarking optimization software with performance profiles
- Conditioning of Quasi-Newton Methods for Function Minimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- Convergence Properties of the BFGS Algorithm
- Convergence analysis of a modified BFGS method on convex minimizations
- Convergence of DFP algorithm
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Efficient hybrid conjugate gradient techniques
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- Local convergence analysis for partitioned quasi-Newton updates
- Methods of conjugate gradients for solving linear systems
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- New quasi-Newton equation and related methods for unconstrained optimization
- On the Convergence of the Variable Metric Algorithm
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- On the convergence properties of the unmodified PRP method with a non-descent line search
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Quasi-Newton Methods, Motivation and Theory
- Spectral gradient method for impulse noise removal
- Spectral gradient projection method for solving nonlinear monotone equations
- The BFGS method with exact line searches fails for non-convex objective functions
- The Convergence of a Class of Double-rank Minimization Algorithms
- The conjugate gradient method in extremal problems
- The convergence properties of some new conjugate gradient methods
- The divergence of the BFGS and Gauss Newton methods
- The global convergence of a modified BFGS method for nonconvex functions
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Variable Metric Method for Minimization
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
Cited in (10)
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- Two-Metric Projection Methods for Constrained Optimization
- Global convergence of a cautious projection BFGS algorithm for nonconvex problems without gradient Lipschitz continuity
- A class of three-term derivative-free methods for large-scale nonlinear monotone system of equations and applications to image restoration problems
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for nonconvex functions
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- A Dai-Liao-type projection method for monotone nonlinear equations and signal processing
- A structured L-BFGS method and its application to inverse problems