Scalable subspace methods for derivative-free nonlinear least-squares optimization
Publication: 6038650
DOI: 10.1007/s10107-022-01836-1
arXiv: 2102.12016
OpenAlex: W3133264924
MaRDI QID: Q6038650
Coralia Cartis, Lindon Roberts
Publication date: 2 May 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2102.12016
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Derivative-free methods and methods using generalized derivatives (90C56)
Related Items
- Direct Search Based on Probabilistic Descent in Reduced Spaces
- Global optimization using random embeddings
- Quadratic regularization methods with finite-difference gradient approximations
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Sharp nonasymptotic bounds on the norm of random matrices with independent entries
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization
- Worst case complexity of direct search
- On trust region methods for unconstrained minimization without derivatives
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Stochastic optimization using a trust-region method and random models
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Sub-sampled Newton methods
- Connected components in random graphs with given expected degree sequences
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Inexact derivative-free optimization for bilevel learning
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
- A stochastic subspace approach to gradient-free optimization in high dimensions
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- A derivative-free Gauss-Newton method
- Coordinate descent algorithms
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Random gradient-free minimization of convex functions
- Direct search based on probabilistic feasible descent for bound and linearly constrained problems
- Geometry of interpolation sets in derivative free optimization
- Bound-constrained global optimization of functions with low effective dimensionality using multiple random embeddings
- Bayesian Optimization in a Billion Dimensions via Random Embeddings
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the NonSmooth Case
- Convergence of Trust-Region Methods Based on Probabilistic Models
- Computational Advertising: Techniques for Targeting Relevant Ads
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- A Derivative-Free Algorithm for Least-Squares Minimization
- Randomized Algorithms for Matrices and Data
- Levenberg--Marquardt Methods Based on Probabilistic Gradient Models and Inexact Subproblem Solution, with Application to Data Assimilation
- Sparser Johnson-Lindenstrauss Transforms
- Introduction to Derivative-Free Optimization
- Trust Region Methods
- Complexity and global rates of trust-region methods based on probabilistic models
- Parallel Selective Algorithms for Nonconvex Big Data Optimization
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- Advances and Trends in Optimization with Engineering Applications
- Detection and Remediation of Stagnation in the Nelder--Mead Algorithm Using a Sufficient Decrease Condition
- Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers
- Stochastic Three Points Method for Unconstrained Smooth Minimization
- A Derivative-Free Method for Structured Optimization Problems
- Error bounds for overdetermined and underdetermined generalized centred simplex gradients
- Inexact Block Coordinate Descent Algorithms for Nonsmooth Nonconvex Optimization
- An investigation of Newton-Sketch and subsampled Newton methods
- Benchmarking Derivative-Free Optimization Algorithms
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points
- Derivative-free optimization methods
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Direct Search Based on Probabilistic Descent
- Optimizing partially separable functions without derivatives
- Benchmarking optimization software with performance profiles
- Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions
- Two decades of blackbox optimization applications