Global convergence and stabilization of unconstrained minimization methods without derivatives
Recommendations
- On the Global Convergence of Derivative-Free Methods for Unconstrained Optimization
- Convergence and stability of line search methods for unconstrained optimization
- Global convergence of conjugate gradient methods without line search
- Full convergence of the steepest descent method with inexact line searches
- scientific article; zbMATH DE number 90302
Cites work
- scientific article; zbMATH DE number 3687182
- scientific article; zbMATH DE number 3748742
- scientific article; zbMATH DE number 3381785
- scientific article; zbMATH DE number 3407464
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A note on a sufficient-decrease criterion for a non-derivative step-length procedure
- A superlinearly convergent algorithm for minimization without evaluating derivatives
- An effective algorithm for minimization
- Stopping criteria for linesearch methods without derivatives
Cited in (21)
- Extended global convergence framework for unconstrained optimization
- Gradient-only approaches to avoid spurious local minima in unconstrained optimization
- A parameter-free unconstrained reformulation for nonsmooth problems with convex constraints
- A local search method for costly black-box problems and its application to CSP plant start-up optimization refinement
- Global convergence technique for the Newton method with periodic Hessian evaluation
- scientific article; zbMATH DE number 90302
- Minimization of \(SC^1\) functions and the Maratos effect
- A robust optimization approach for magnetic spacecraft attitude stabilization
- Frame-based ray search algorithms in unconstrained optimization
- Nonmonotone derivative-free methods for nonlinear equations
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- A globally convergent version of the Polak-Ribière conjugate gradient method
- On convergence analysis of a derivative-free trust region algorithm for constrained optimization with separable structure
- A class of derivative-free nonmonotone optimization algorithms employing coordinate rotations and gradient approximations
- Optimizing partially separable functions without derivatives
- On the Global Convergence of Derivative-Free Methods for Unconstrained Optimization
- Perturbed steepest-descent technique in multiextremal problems
- Stationarity and convergence in reduce-or-retreat minimization
- Convergence and stability of line search methods for unconstrained optimization
- Globally convergent block-coordinate techniques for unconstrained optimization
- Derivative-free optimization methods