Stochastic trust-region algorithm in random subspaces with convergence and expected complexity analyses
Publication:6580002
DOI: 10.1137/22M1524072
MaRDI QID: Q6580002
Authors: Kwassi Joseph Dzahini, Stefan M. Wild
Publication date: 29 July 2024
Published in: SIAM Journal on Optimization
Recommendations
- Stochastic optimization using a trust-region method and random models
- Stochastic derivative-free optimization using a trust region framework
- Expected decrease for derivative-free algorithms using random subspaces
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Trust-region algorithms: probabilistic complexity and intrinsic noise with applications to subsampling techniques
Mathematics Subject Classification:
- Nonlinear programming (90C30)
- Stochastic programming (90C15)
- Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- ASTRO-DF: a class of adaptive sampling trust-region algorithms for derivative-free stochastic optimization
- Extensions of Lipschitz mappings into a Hilbert space
- A basic course in probability theory
- Title not available
- Adaptive estimation of a quadratic functional by model selection.
- Sharp nonasymptotic bounds on the norm of random matrices with independent entries
- Title not available
- The Efficient Generation of Random Orthogonal Matrices with an Application to Condition Estimators
- Introduction to Derivative-Free Optimization
- Benchmarking Derivative-Free Optimization Algorithms
- Stochastic optimization using a trust-region method and random models
- Estimating derivatives of noisy simulations
- Stochastic Nelder-Mead simplex method -- a new globally convergent direct search method for simulation optimization
- Sparser Johnson-Lindenstrauss transforms
- A stochastic line search method with expected complexity analysis
- Derivative-free and blackbox optimization
- The random matrix theory of the classical compact groups
- Sketching as a tool for numerical linear algebra
- Derivative-free optimization methods
- Stochastic trust-region methods with trust-region radius depending on probabilistic models
- Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- Expected complexity analysis of stochastic direct-search
- Constrained stochastic blackbox optimization using a progressive barrier and probabilistic estimates
- A stochastic Levenberg-Marquardt method using random models with complexity results
- Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization
- Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions
- Direct Search Based on Probabilistic Descent in Reduced Spaces
Cited In (2)