Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions
From MaRDI portal
Publication:6094491
Abstract: A novel derivative-free algorithm, optimization by moving ridge functions (OMoRF), for unconstrained and bound-constrained optimization is presented. This algorithm couples trust region methodologies with output-based dimension reduction to accelerate convergence of model-based optimization strategies. The dimension-reducing subspace is updated as the trust region moves through the function domain, allowing OMoRF to be applied to functions with no known global low-dimensional structure. Furthermore, its low computational requirement allows it to make rapid progress when optimizing high-dimensional functions. Its performance is examined on a set of test problems of moderate to high dimension and a high-dimensional design optimization problem. The results show that OMoRF compares favourably to other common derivative-free optimization methods, even for functions in which no underlying global low-dimensional structure is known.
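The abstract's core idea — a trust-region loop whose surrogate model lives on a dimension-reducing subspace that is re-estimated as the region moves — can be illustrated with a toy sketch. This is not the authors' OMoRF code: the one-dimensional subspace estimate via least-squares regression, the quadratic line model, and all constants below are illustrative assumptions.

```python
import numpy as np

def toy_moving_ridge_minimize(f, x0, radius=1.0, iters=60, samples=None, seed=0):
    """Toy illustration (not the published OMoRF algorithm): each iteration
    re-estimates a one-dimensional ridge direction from regression on
    trust-region samples, fits a quadratic along it, and takes a
    trust-region step with a simple accept/reject radius update."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    m = samples if samples is not None else 2 * n
    fx = f(x)
    for _ in range(iters):
        # Sample inside the current trust region.
        D = rng.uniform(-radius, radius, size=(m, n))
        fvals = np.array([f(x + d) for d in D])
        # Linear least-squares fit; its gradient estimate spans the
        # (here one-dimensional) moving subspace.
        g, *_ = np.linalg.lstsq(D, fvals - fx, rcond=None)
        norm = np.linalg.norm(g)
        if norm < 1e-12:
            break
        u = g / norm
        # Quadratic model of f along u from three function values.
        ys = np.array([f(x - radius * u), fx, f(x + radius * u)])
        c2 = (ys[0] - 2 * ys[1] + ys[2]) / (2 * radius ** 2)
        c1 = (ys[2] - ys[0]) / (2 * radius)
        t = -c1 / (2 * c2) if c2 > 1e-12 else -np.sign(c1) * radius
        t = float(np.clip(t, -radius, radius))
        x_new = x + t * u
        f_new = f(x_new)
        if f_new < fx:                 # accept: gently expand the region
            x, fx = x_new, f_new
            radius = min(radius * 1.5, 10.0)
        else:                          # reject: shrink the region
            radius *= 0.5
        if radius < 1e-10:
            break
    return x, fx
```

On an exact ridge function such as f(x) = (a·x)² in ten dimensions, the regression recovers the ridge direction a up to sampling noise, so the sketch makes rapid progress without evaluating gradients — mirroring, in miniature, the behaviour the abstract claims for high-dimensional functions with hidden low-dimensional structure.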
Recommendations
- Derivative-free optimization of expensive functions with computational error using weighted regression
- Optimization algorithm based on densification and dynamic canonical descent
- Wedge trust region method for derivative free optimization
- Branch-and-Model: a derivative-free global optimization algorithm
- Finding local optima of high-dimensional functions using direct search methods
Cites work
- scientific article; zbMATH DE number 653035
- scientific article; zbMATH DE number 6276119
- A Simplex Method for Function Minimization
- A dimensionality reduction technique for unconstrained global optimization of functions with low effective dimensionality
- A note on performance profiles for benchmarking software
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Active subspace methods in theory and practice: applications to kriging surfaces
- Active subspaces. Emerging ideas for dimension reduction in parameter studies
- Bayesian optimization in a billion dimensions via random embeddings
- Benchmarking Derivative-Free Optimization Algorithms
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- Convergence of trust-region methods based on probabilistic models
- Data-driven polynomial ridge approximation using variable projection
- Design-space dimensionality reduction in shape optimization by Karhunen-Loève expansion
- Dimension reduction in magnetohydrodynamics power generation models: Dimensional analysis and active subspaces
- Embedded ridge approximations
- Geometry of interpolation sets in derivative free optimization
- Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- Introduction to Derivative-Free Optimization
- On the geometry phase in model-based algorithms for derivative-free optimization
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Ridge functions
- The NEWUOA software for unconstrained optimization without derivatives
- Time-dependent global sensitivity analysis with active subspaces for a lithium ion battery model
- Trust Region Methods
Cited in (11)
- \(Q\)-fully quadratic modeling and its application in a random subspace derivative-free method
- A subset-selection-based derivative-free optimization algorithm for dynamic operation optimization in a steel-making process
- Geodesic and contour optimization using conformal mapping
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Hierarchical gradient-based optimization with B-splines on sparse grids
- Stochastic trust-region algorithm in random subspaces with convergence and expected complexity analyses
- Optimization algorithm based on densification and dynamic canonical descent
- New horizons in sphere-packing theory, part II: Lattice-based derivative-free optimization via global surrogates
- Derivative-free optimization of expensive functions with computational error using weighted regression
- Branch-and-Model: a derivative-free global optimization algorithm
- On the representability of a continuous multivariate function by sums of ridge functions