Gradient-only approaches to avoid spurious local minima in unconstrained optimization
DOI: 10.1007/S11081-011-9178-7
zbMATH Open: 1294.65071
OpenAlex: W2140303265
MaRDI QID: Q402201
Schalk Kok, Daniel N. Wilke, Johannes Arnoldus Snyman, Albert A. Groenwold
Publication date: 27 August 2014
Published in: Optimization and Engineering
Full work available at URL: http://hdl.handle.net/2263/39764
Keywords: unconstrained optimization; shape optimization; partial differential equations; gradient-only optimization; step discontinuous; variable discretization strategies
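A minimal sketch of the gradient-only idea this record's title and keywords refer to: variable discretization strategies can make the objective step-discontinuous, and those downward steps create spurious local minima in the function values while leaving the gradient field essentially intact. A gradient-only line search therefore ignores function values and instead bisects for the step length at which the directional derivative changes sign. The test function, names, and tolerances below are illustrative assumptions, not taken from the paper.

```python
def grad(x):
    # Associated derivative of an illustrative objective
    # f(x) = x**2 + 0.5*floor(3*x): the floor term is piecewise
    # constant, so the derivative is 2*x almost everywhere. The
    # downward steps put spurious local minima into f itself,
    # but leave this gradient field untouched.
    return 2.0 * x

def sign_change_step(x, d, a_lo=0.0, a_hi=4.0, tol=1e-10):
    # Gradient-only line search: bisect for the step length a at which
    # the directional derivative grad(x + a*d) * d changes sign from
    # negative to non-negative, instead of comparing function values.
    for _ in range(200):
        a = 0.5 * (a_lo + a_hi)
        if grad(x + a * d) * d < 0.0:
            a_lo = a
        else:
            a_hi = a
        if a_hi - a_lo < tol:
            break
    return 0.5 * (a_lo + a_hi)

x = 3.7
for _ in range(50):
    g = grad(x)
    if abs(g) < 1e-8:
        break
    d = -1.0 if g > 0 else 1.0   # unit steepest-descent direction
    x = x + sign_change_step(x, d) * d

print(abs(x) < 1e-6)  # prints True: descends past every step to x ≈ 0
```

A descent method driven by function values would stall at the first downward step it meets; the sign-change search walks through every discontinuity to the minimizer of the smooth underlying problem.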
Cites Work
- Title not available
- Title not available
- Practical mathematical optimization. An introduction to basic optimization theory and classical and new gradient-based algorithms
- Title not available
- Structural optimization using sensitivity analysis and a level-set method
- Incomplete series expansion for function approximation
- A class of globally convergent optimization methods based on conservative convex separable approximations
- Title not available
- Discontinuous Optimization by Smoothing
- Title not available
- The dynamic-Q optimization method: an alternative to SQP?
- A quadratically convergent unstructured remeshing strategy for shape optimization
- The application of gradient-only optimization methods for problems discretized using non-constant methods
- A gradient-only line search method for the conjugate gradient method applied to constrained optimization problems with severe noise in the objective function
- The spherical quadratic steepest descent (SQSD) method for unconstrained minimization with no explicit line searches
- Discontinuous piecewise linear optimization
Cited In (4)
- On the Absence of Spurious Local Trajectories in Time-Varying Nonconvex Optimization
- Resolving learning rates adaptively by locating stochastic non-negative associated gradient projection points using line searches
- Finding approximate local minima faster than gradient descent
- An empirical study into finding optima in stochastic optimization of neural networks
Recommendations
- Gradient methods for nonstationary unconstrained optimization problems
- Title not available
- Global descent methods for unconstrained global optimization
- Title not available
- On the steplength selection in gradient methods for unconstrained optimization
- The conjugate gradient method for unconstrained minimization
- Title not available
- Global convergence and stabilization of unconstrained minimization methods without derivatives
- On the Global Convergence of Derivative-Free Methods for Unconstrained Optimization
- Title not available