Derivative-free optimization methods
Publication: 5230522
DOI: 10.1017/S0962492919000060
zbMATH: 1461.65169
DBLP: journals/actanum/LarsonMW19
arXiv: 1904.11585
OpenAlex: W3102143361
Wikidata: Q90161505 (Scholia: Q90161505)
MaRDI QID: Q5230522
Jeffrey Larson, Matt Menickelly, Stefan M. Wild
Publication date: 28 August 2019
Published in: Acta Numerica
Full work available at URL: https://arxiv.org/abs/1904.11585
Keywords: complexity; minimax; algorithm; derivative-free optimization; convergence; simulation; stochastic; augmented Lagrangian; SQP; local and global optimization; penalty; multi-objective; constrained; GPS; MADS; bilevel; multistart; DDS; unconstrained; composite nonsmooth; multi-fidelity; trust region; underestimator; WCC
- Numerical mathematical programming methods (65K05)
- Derivative-free methods and methods using generalized derivatives (90C56)
- Research exposition (monographs, survey articles) pertaining to operations research and mathematical programming (90-02)
- Computational methods for problems pertaining to operations research and mathematical programming (90-08)
Related Items (72)
Uses Software
Cites Work
- On the optimal order of worst case complexity of direct search
- Stochastic derivative-free optimization using a trust region framework
- On Lipschitz optimization based on gray-box piecewise linearization
- Bilevel direct search method for leader-follower problems and application in health insurance
- A derivative-free approximate gradient sampling algorithm for finite minimax problems
- Sobolev seminorm of quadratic functions with applications to derivative-free optimization
- A derivative-free algorithm for linearly constrained optimization problems
- On the local convergence of a derivative-free algorithm for least-squares minimization
- Analysis of direct searches for discontinuous functions
- Implementing the Nelder-Mead simplex algorithm with adaptive parameters
- Stochastic recursive algorithms for optimization. Simultaneous perturbation methods
- Recent advances in algorithmic differentiation. Selected papers based on the presentations at the 6th international conference on automatic differentiation (AD2012), Fort Collins, CO, USA, July 23--27, 2012.
- On fast trust region methods for quadratic models with linear constraints
- SO-MI: a surrogate model algorithm for computationally expensive nonlinear mixed-integer black-box global optimization problems
- An inexact restoration derivative-free filter method for nonlinear programming
- Direct search algorithm for bilevel programming problems
- Derivative-free methods for nonlinear programming with general lower-level constraints
- Active-set strategy in Powell's method for optimization without derivatives
- CONDOR, a new parallel, constrained extension of Powell's UOBYQA algorithm: Experimental results and comparison with the DFO algorithm
- A convergent variant of the Nelder--Mead algorithm
- Robust optimization with simulated annealing
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- Do you trust derivatives or differences?
- Worst case complexity of direct search
- SO-I: a surrogate model algorithm for expensive nonlinear integer programming problems including global optimization applications
- Nonlinear programming without a penalty function or a filter
- Grid restrained Nelder-Mead algorithm
- Additive scaling and the \texttt{DIRECT} algorithm
- Mesh adaptive direct search with second directional derivative-based Hessian update
- Global convergence of trust-region algorithms for convex constrained minimization without derivatives
- A derivative-free nonmonotone line-search technique for unconstrained optimization
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- Design and implementation of a massively parallel version of DIRECT
- Variable-number sample-path optimization
- A method for stochastic constrained optimization using derivative-free surrogate pattern search and collocation
- Incorporating minimum Frobenius norm models in direct search
- Mesh adaptive direct search algorithms for mixed variable optimization
- Generalized pattern search methods for a class of nonsmooth optimization problems with structure
- Pattern search ranking and selection algorithms for mixed variable simulation-based optimization
- Asymptotically efficient adaptive allocation rules
- Global convergence and stabilization of unconstrained minimization methods without derivatives
- Pure adaptive search in Monte Carlo optimization
- Simplex direct search algorithms
- Pure adaptive search in global optimization
- Efficient global optimization of expensive black-box functions
- Global optimization by multilevel coordinate search
- Lipschitzian optimization without the Lipschitz constant
- Recent progress in unconstrained nonlinear optimization without derivatives
- On trust region methods for unconstrained minimization without derivatives
- Introductory lectures on convex optimization. A basic course.
- Optimization of automotive valve train components with implicit filtering
- Direct search methods: Then and now
- Objective-derivative-free methods for constrained optimization
- Gilding the lily: A variant of the Nelder-Mead algorithm based on Golden-section search
- Asynchronously parallel optimization solver for finding multiple minima
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Stochastic optimization using a trust-region method and random models
- Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm
- GOSAC: global optimization with surrogate approximation of constraints
- Compositions of convex functions and fully linear models
- On the construction of quadratic models for derivative-free trust-region algorithms
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- MILP models for the selection of a small set of well-distributed points
- An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints
- Efficient solution of quadratically constrained quadratic subproblems within the mesh adaptive direct search algorithm
- MultiGLODS: global and local multiobjective optimization using direct search
- UOBYQA: unconstrained optimization by quadratic approximation
- Global optimization of costly nonconvex functions using radial basis functions
- Algorithms for noisy problems in gas transmission pipeline optimization
- Exploiting band structure in unconstrained optimization without derivatives
- On the convergence of the UOBYQA method
- Constrained optimization involving expensive function evaluations: A sequential approach
- Generalized pattern searches with derivative information
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Stochastic Nelder-Mead simplex method -- a new globally convergent direct search method for simulation optimization
- Derivative-free methods for bound constrained mixed-integer optimization
- On the convergence of trust region algorithms for unconstrained minimization without derivatives
- Beyond symmetric Broyden for updating quadratic models in minimization without derivatives
- New horizons in sphere-packing theory, part II: Lattice-based derivative-free optimization via global surrogates
- A derivative-free trust-region algorithm for composite nonsmooth optimization
- Performance indicators in multiobjective optimization
- A method for convex black-box integer global optimization
- An algorithmic framework based on primitive directions and nonmonotone line searches for black-box optimization problems with integer variables
- Derivative-free optimization via proximal point methods
- A mesh adaptive direct search algorithm for multiobjective optimization
- Derivative-free robust optimization by outer approximations
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
- A derivative-free Gauss-Newton method
- Derivative free methodologies for circuit worst case analysis
- Linear equalities in blackbox optimization
- A derivative-free trust-funnel method for equality-constrained nonlinear optimization
- Derivative-free robust optimization for circuit design
- Derivative-free methods for mixed-integer constrained optimization problems
- MrDIRECT: a multilevel robust DIRECT algorithm for global optimization problems
- GLODS: global and local optimization using direct search
- The calculus of simplex gradients
- Optimization with hidden constraints and embedded Monte Carlo computations
- MISO: mixed-integer surrogate optimization framework
- A batch, derivative-free algorithm for finding multiple local minima
- On the properties of positive spanning sets and positive bases
- Multivariate stochastic approximation using a simultaneous perturbation gradient approximation
- Direct Search Methods on Parallel Machines
- On the Convergence of the Multidirectional Search Algorithm
- Convergence estimates for iterative minimization methods
- A View of Unconstrained Minimization Algorithms that Do Not Require Derivatives
- A superlinearly convergent algorithm for minimization without evaluating derivatives
- Convergence Properties of the Nelder--Mead Simplex Method in Low Dimensions
- Convergence of the Nelder--Mead Simplex Method to a Nonstationary Point
- Nelder-Mead Simplex Modifications for Simulation Optimization
- Variational Analysis
- Budget-Dependent Convergence Rate of Stochastic Approximation
- Introduction to Stochastic Search and Optimization
- Analysis of Generalized Pattern Searches
- On the Local Convergence of Pattern Search
- A mathematical view of automatic differentiation
- A derivative-free line search and global convergence of Broyden-like method for nonlinear equations
- Trust Region Methods
- Pattern Search Methods for Linearly Constrained Minimization
- Superlinear Convergence and Implicit Filtering
- A multigrid approach to discretized optimization problems
- Manifold Sampling for Optimization of Nonconvex Functions That Are Piecewise Linear Compositions of Smooth Components
- Complexity and global rates of trust-region methods based on probabilistic models
- ASTRO-DF: A Class of Adaptive Sampling Trust-Region Algorithms for Derivative-Free Stochastic Optimization
- BFO, A Trainable Derivative-free Brute Force Optimizer for Nonlinear Bound-constrained Optimization and Equilibrium Computations with Continuous and Discrete Variables
- Derivative-Free and Blackbox Optimization
- Algebraic multigrid methods
- On Sampling Rates in Simulation-Based Recursions
- Approximate norm descent methods for constrained nonlinear systems
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- A Derivative-Free Trust-Region Algorithm for the Optimization of Functions Smoothed via Gaussian Convolution Using Adaptive Multiple Importance Sampling
- Optimization Methods for Large-Scale Machine Learning
- A Globally Convergent Filter Method for Nonlinear Programming
- A Pattern Search Filter Method for Nonlinear Programming without Derivatives
- On the use of quadratic models in unconstrained minimization without derivatives
- Stochastic finite element methods for partial differential equations with random input data
- Pattern Search Algorithms for Bound Constrained Minimization
- Detection and Remediation of Stagnation in the Nelder--Mead Algorithm Using a Sufficient Decrease Condition
- The Nonstochastic Multiarmed Bandit Problem
- A Globally Convergent Augmented Lagrangian Pattern Search Algorithm for Optimization with General Constraints and Simple Bounds
- New Sequential and Parallel Derivative-Free Algorithms for Unconstrained Minimization
- On the Global Convergence of Derivative-Free Methods for Unconstrained Optimization
- A Class of Trust-Region Methods for Parallel Optimization
- 10.1162/153244303321897663
- An Implicit Filtering Algorithm for Optimization of Functions with Many Local Minima
- A grid algorithm for bound constrained optimization of noisy functions
- Sample mean based index policies by O(log n) regret for the multi-armed bandit problem
- Use of quadratic models with mesh-adaptive direct search for constrained black box optimization
- Derivative-free optimization methods for finite minimax problems
- Fortified-Descent Simplicial Search Method: A General Approach
- Simulation-Based Optimization with Stochastic Approximation Using Common Random Numbers
- Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers
- Kernel-based methods for bandit convex optimization
- A Derivative-Free Trust-Region Method for Biobjective Optimization
- A Survey on Direct Search Methods for Blackbox Optimization and Their Applications
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- A derivative-free 𝒱𝒰-algorithm for convex finite-max problems
- Surrogate Optimization of Computationally Expensive Black-Box Problems with Hidden Constraints
- Exploiting Known Structures to Approximate Normal Cones
- Function Minimization by Interpolation in a Data Table
- On the convergence of a wide range of trust region methods for unconstrained optimization
- A Derivative-Free Algorithm for Inequality Constrained Nonlinear Programming via Smoothing of an $\ell_\infty$ Penalty Function
- Benchmarking Derivative-Free Optimization Algorithms
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- A Progressive Barrier for Derivative-Free Nonlinear Programming
- A Stochastic Line Search Method with Expected Complexity Analysis
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Numerical Analysis of $\mathcal{V}\mathcal{U}$-Decomposition, $\mathcal{U}$-Gradient, and $\mathcal{U}$-Hessian Approximations
- A Merit Function Approach for Direct Search
- CONORBIT: constrained optimization by radial basis function interpolation in trust regions
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Surrogate‐based methods for black‐box optimization
- Complete search in continuous global optimization and constraint satisfaction
- Derivative-Free Optimization of Expensive Functions with Computational Error Using Weighted Regression
- Stochastic Convex Optimization with Bandit Feedback
- Multicriteria Optimization
- An efficient method for finding the minimum of a function of several variables without calculating derivatives
- An Optimal Algorithm for Bandit and Zero-Order Convex Optimization with Two-Point Feedback
- Estimating Tangent and Normal Cones Without Calculus
- Calibration by optimization without using derivatives
- Pattern search methods for finite minimax problems
- Nonmonotone derivative-free methods for nonlinear equations
- Constrained derivative-free optimization on thin domains
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Stochastic online optimization. Single-point and multi-point non-linear multi-armed bandits. Convex and strongly-convex case
- Random gradient-free minimization of convex functions
- Direct search based on probabilistic feasible descent for bound and linearly constrained problems
- Efficient calculation of regular simplex gradients
- Using QR decomposition to obtain a new instance of mesh adaptive direct search with uniformly distributed polling directions
- A generating set search method using curvature information
- Geometry of interpolation sets in derivative free optimization
- Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- A class of derivative-free nonmonotone optimization algorithms employing coordinate rotations and gradient approximations
- A trust-region-based derivative free algorithm for mixed integer programming
- Random optimization
- Derivative free analogues of the Levenberg-Marquardt and Gauss algorithms for nonlinear least squares approximation
- Convergence results for generalized pattern search algorithms are tight
- Zeroth-order nonconvex stochastic optimization: handling constraints, high dimensionality, and saddle points
- Pattern Search Algorithms for Mixed Variable Programming
- Asynchronous Parallel Pattern Search for Nonlinear Optimization
- On the Lagrange functions of quadratic models that are defined by interpolation*
- A second-order globally convergent direct-search method and its worst-case complexity
- Numerical experience with a derivative-free trust-funnel method for nonlinear optimization problems with general nonlinear constraints
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the Nonsmooth Case
- Global Convergence of Radial Basis Function Trust-Region Algorithms for Derivative-Free Optimization
- Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization
- Inexact Restoration Method for Derivative-Free Optimization with Smooth Constraints
- Optimization of Convex Functions with Random Pursuit
- On sequential and parallel non-monotone derivative-free algorithms for box constrained optimization
- Bilevel derivative-free optimization and its application to robust optimization
- A Stochastic Radial Basis Function Method for the Global Optimization of Expensive Functions
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- Convergence of the Restricted Nelder--Mead Algorithm in Two Dimensions
- A subclass of generating set search with convergence to second-order stationary points
- Coordinate search algorithms in multilevel optimization
- A Linesearch-Based Derivative-Free Approach for Nonsmooth Constrained Optimization
- Convergence of Trust-Region Methods Based on Probabilistic Models
- Global convergence of a derivative-free inexact restoration filter algorithm for nonlinear programming
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Algorithm 897
- Algorithm 909
- Estimating Derivatives of Noisy Simulations
- A stochastic method for global optimization
- A Nonderivative Version of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization
- Sequential Penalty Derivative-Free Methods for Nonlinear Constrained Optimization
- Self-Correcting Geometry in Model-Based Algorithms for Derivative-Free Unconstrained Optimization
- A Derivative-Free Algorithm for Least-Squares Minimization
- Lipschitz Bandits without the Lipschitz Constant
- A class of derivative-free methods for large-scale nonlinear monotone equations
- An active-set trust-region method for derivative-free nonlinear bound-constrained optimization
- On Choosing Parameters in Retrospective-Approximation Algorithms for Stochastic Root Finding and Simulation Optimization
- Robust Optimization for Unconstrained Simulation-Based Problems
- Estimating Computational Noise
- Global Convergence of Radial Basis Function Trust Region Derivative-Free Algorithms
- Direct Multisearch for Multiobjective Optimization
- On the Convergence of Pattern Search Algorithms
- Implicit Filtering
- A Smoothing Direct Search Method for Monte Carlo-Based Bound Constrained Composite Nonsmooth Optimization
- A Derivative-Free Approach to Constrained Multiobjective Nonsmooth Optimization
- Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms
- Solving Nonlinear Programs Without Using Analytic Derivatives
- A Globally Convergent Augmented Lagrangian Algorithm for Optimization with General Constraints and Simple Bounds
- An Application of Chung's Lemma to the Kiefer-Wolfowitz Stochastic Approximation Procedure
- Computing a Trust Region Step
- Parallel Space Decomposition of the Mesh Adaptive Direct Search Algorithm
- Convergence of Mesh Adaptive Direct Search to Second‐Order Stationary Points
- A derivative-free comirror algorithm for convex optimization
- A trust-region derivative-free algorithm for constrained optimization
- Using Sampling and Simplex Derivatives in Pattern Search Methods
- Evaluating Derivatives
- Developments of NEWUOA for minimization without derivatives
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Using simplex gradients of nonsmooth functions in direct search methods
- Algorithm 856
- OrthoMADS: A Deterministic MADS Instance with Orthogonal Directions
- On the geometry phase in model-based algorithms for derivative-free optimization
- Multiobjective Optimization Through a Series of Single-Objective Formulations
- Introduction to Derivative-Free Optimization
- ORBIT: Optimization by Radial Basis Function Interpolation in Trust-Regions
- Computing Forward-Difference Intervals for Numerical Optimization
- Conditions for convergence of trust region algorithms for nonsmooth optimization
- Stopping criteria for linesearch methods without derivatives
- Stochastic global optimization methods part I: Clustering methods
- Stochastic global optimization methods part II: Multi level methods
- Two-Point Step Size Gradient Methods
- "Direct Search" Solution of Numerical and Statistical Problems
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Rates of Convergence for Stochastic Approximation Type Algorithms
- Dud, A Derivative-Free Algorithm for Nonlinear Least Squares
- On the Number of Iterations of Piyavskii's Global Optimization Algorithm
- Convergence rates of efficient global optimization algorithms
- Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Mixed-integer nonlinear optimization
- Stationarity Results for Generating Set Search for Linearly Constrained Optimization
- Spectral residual method without gradient information for solving large-scale nonlinear systems of equations
- Second-Order Behavior of Pattern Search
- Precision Control for Generalized Pattern Search Algorithms with Adaptive Precision Function Evaluations
- A Derivative-Free Algorithm for Linearly Constrained Finite Minimax Problems
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- Exploiting problem structure in pattern search methods for unconstrained optimization
- Direct Search Based on Probabilistic Descent
- Manifold Sampling for $\ell_1$ Nonconvex Optimization
- Function Minimization Without Evaluating Derivatives--a Review
- A New Method of Constrained Optimization and a Comparison With Other Methods
- A Method for Minimizing a Sum of Squares of Non-Linear Functions Without Calculating Derivatives
- A Comparison of Several Current Optimization Methods, and the use of Transformations in Constrained Problems
- Sequential Search: A Method for Solving Constrained Optimization Problems
- The solution of variational and boundary value problems by the method of local variations
- A New Method for Minimising a Sum of Squares without Calculating Gradients
- Asymptotic Distribution of Stochastic Approximation Procedures
- A Simplex Method for Function Minimization
- Optimizing partially separable functions without derivatives
- Positive Bases for Linear Spaces
- Sequential Application of Simplex Designs in Optimisation and Evolutionary Operation
- A projected derivative-free algorithm for nonlinear equations with convex constraints
- Stochastic Estimation of the Maximum of a Regression Function
- Some aspects of the sequential design of experiments
- A Stochastic Approximation Method
- Approximation Methods which Converge with Probability one
- Multidimensional Stochastic Approximation Methods
- Theory of Positive Linear Dependence
- A method for the solution of certain non-linear problems in least squares
- Probability
- Scattered Data Approximation
- Global optimization
- Simulation optimization: a review of algorithms and applications
- Frame based methods for unconstrained optimization
- A radial basis function method for global optimization
- Wedge trust region method for derivative free optimization.
- Finite-time analysis of the multiarmed bandit problem
- A derivative-free algorithm for bound constrained optimization
- Worst case complexity of direct search under convexity
- A sequential quadratic programming algorithm for equality-constrained optimization without derivatives