Gradient-only approaches to avoid spurious local minima in unconstrained optimization
From MaRDI portal
Recommendations
- Gradient methods for nonstationary unconstrained optimization problems (scientific article; zbMATH DE number 1960973)
- Global descent methods for unconstrained global optimization (scientific article; zbMATH DE number 703004)
- On the steplength selection in gradient methods for unconstrained optimization
- The conjugate gradient method for unconstrained minimization (scientific article; zbMATH DE number 3930748)
- Global convergence and stabilization of unconstrained minimization methods without derivatives
- On the Global Convergence of Derivative-Free Methods for Unconstrained Optimization (scientific article; zbMATH DE number 2196505)
Cites work
- scientific article; zbMATH DE number 417962 (title not available)
- scientific article; zbMATH DE number 41026 (title not available)
- scientific article; zbMATH DE number 613868 (title not available)
- scientific article; zbMATH DE number 3894826 (title not available)
- scientific article; zbMATH DE number 5060482 (title not available)
- A class of globally convergent optimization methods based on conservative convex separable approximations
- A gradient-only line search method for the conjugate gradient method applied to constrained optimization problems with severe noise in the objective function
- A quadratically convergent unstructured remeshing strategy for shape optimization
- Discontinuous Optimization by Smoothing
- Discontinuous piecewise linear optimization
- Incomplete series expansion for function approximation
- Practical mathematical optimization. An introduction to basic optimization theory and classical and new gradient-based algorithms
- Structural optimization using sensitivity analysis and a level-set method
- The application of gradient-only optimization methods for problems discretized using non-constant methods
- The dynamic-Q optimization method: an alternative to SQP?
- The spherical quadratic steepest descent (SQSD) method for unconstrained minimization with no explicit line searches
Cited in (5)
- On the Absence of Spurious Local Trajectories in Time-Varying Nonconvex Optimization
- Resolving learning rates adaptively by locating stochastic non-negative associated gradient projection points using line searches
- The application of gradient-only optimization methods for problems discretized using non-constant methods
- An empirical study into finding optima in stochastic optimization of neural networks
- Finding approximate local minima faster than gradient descent