Gradient and diagonal Hessian approximations using quadratic interpolation models and aligned regular bases
From MaRDI portal
Publication: 820723
DOI: 10.1007/s11075-020-01056-8 · zbMath: 1487.65025 · OpenAlex: W3139102753 · Wikidata: Q114224303 · Scholia: Q114224303 · MaRDI QID: Q820723
Rachael Tappenden, Ian D. Coope
Publication date: 27 September 2021
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-020-01056-8
derivative free optimization; finite difference approximations; positive bases; simplices; interpolation models; simplex gradients
Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56); Numerical differentiation (65D25)
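The keywords above (finite difference approximations, interpolation models, simplex gradients) refer to derivative-free estimates of the gradient and the Hessian diagonal. As a point of reference for the paper's topic, the sketch below shows the classical coordinate-aligned central-difference scheme, which is the simplest instance of a quadratic interpolation model on an aligned basis; it is illustrative only and is not the aligned-regular-basis construction of the paper itself.

```python
import numpy as np

def grad_and_diag_hessian(f, x, h=1e-3):
    """Central-difference estimates of the gradient and Hessian diagonal.

    Uses 2n + 1 evaluations of f: at x and at x +/- h*e_i for each
    coordinate direction e_i. For a quadratic f these formulas are exact
    up to rounding error.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    f0 = f(x)
    g = np.empty(n)   # gradient estimate
    d = np.empty(n)   # diagonal-Hessian estimate
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        fp, fm = f(x + e), f(x - e)
        g[i] = (fp - fm) / (2.0 * h)            # first central difference
        d[i] = (fp - 2.0 * f0 + fm) / h**2      # second central difference
    return g, d

# Example: f(x) = x0^2 + 3*x1^2 has gradient (2*x0, 6*x1) and
# Hessian diagonal (2, 6); at x = (1, 2) the estimates recover
# approximately (2, 12) and (2, 6).
g, d = grad_and_diag_hessian(lambda x: x[0]**2 + 3.0 * x[1]**2, [1.0, 2.0])
```

The paper's contribution concerns sample sets built from regular simplices aligned with such bases, which can reduce the number of function evaluations or improve the error constants relative to this naive scheme.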
Related Items
- Ensemble-Based Gradient Inference for Particle Methods in Optimization and Sampling
- Approximating the diagonal of a Hessian: which sample set of points should be used
Cites Work
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- Recent progress in unconstrained nonlinear optimization without derivatives
- An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints
- Uniform simplex of an arbitrary orientation
- Random gradient-free minimization of convex functions
- Efficient calculation of regular simplex gradients
- Geometry of interpolation sets in derivative free optimization
- A Hitchhiker's guide to automatic differentiation
- Global Convergence of Radial Basis Function Trust-Region Algorithms for Derivative-Free Optimization
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- On the geometry phase in model-based algorithms for derivative-free optimization
- Introduction to Derivative-Free Optimization
- Derivative-Free and Blackbox Optimization
- A Derivative-Free Trust-Region Algorithm for the Optimization of Functions Smoothed via Gaussian Convolution Using Adaptive Multiple Importance Sampling
- An Implicit Filtering Algorithm for Optimization of Functions with Many Local Minima
- Adaptive Interpolation Strategies in Derivative-Free Optimization: a case study
- A Simplex Method for Function Minimization
- Sequential Application of Simplex Designs in Optimisation and Evolutionary Operation
- Calculus Identities for Generalized Simplex Gradients: Rules and Applications
- Frame based methods for unconstrained optimization