Zeroth-order optimization with orthogonal random directions
Publication: 6038668
DOI: 10.1007/s10107-022-01866-9 · zbMath: 1518.90127 · arXiv: 2107.03941 · OpenAlex: W3181693521 · Wikidata: Q114228475 · Scholia: Q114228475 · MaRDI QID: Q6038668
Lorenzo Rosasco, Luis Tenorio, Silvia Villa, Cesare Molinari, David Kozak
Publication date: 2 May 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2107.03941
Keywords: convex programming, stochastic algorithms, random search, derivative-free methods, finite differences approximation, zeroth-order optimization, Polyak-Łojasiewicz inequality
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56)
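
The title and keywords describe a zeroth-order method that builds descent directions from finite-difference evaluations along orthogonal random directions. Below is a minimal illustrative sketch of that general idea in Python, not the paper's exact algorithm: the fixed step size, smoothing parameter, and the QR-based sampling of orthonormal directions are assumptions made for the example.

```python
# Illustrative sketch of zeroth-order descent with orthogonal random directions.
# Assumptions (not taken from the paper): smooth objective f: R^d -> R, fixed
# step size and finite-difference increment, orthonormal directions drawn via
# QR factorization of a Gaussian matrix.
import numpy as np

def zeroth_order_step(f, x, num_dirs, h=1e-4, step=1e-2, rng=None):
    """One iteration: move along a surrogate gradient estimated from forward
    finite differences of f along `num_dirs` orthonormal random directions."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    # Orthonormal random directions: the columns of Q.
    G = rng.standard_normal((d, num_dirs))
    Q, _ = np.linalg.qr(G)
    # Forward finite differences along each direction.
    fx = f(x)
    coeffs = np.array([(f(x + h * Q[:, i]) - fx) / h for i in range(num_dirs)])
    grad_est = Q @ coeffs  # surrogate gradient restricted to the sampled subspace
    return x - step * grad_est

if __name__ == "__main__":
    # Toy usage: minimize a convex quadratic in R^20 using only function values.
    A = np.diag(np.linspace(1.0, 10.0, 20))
    f = lambda x: 0.5 * x @ A @ x
    x = np.ones(20)
    for _ in range(500):
        x = zeroth_order_step(f, x, num_dirs=5)
    print("final objective:", f(x))
```

Using several orthogonal directions per iteration (rather than a single random direction) is the design choice the title points to; in the sketch this simply means `num_dirs > 1` columns of the QR factor.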
Related Items
- A Zeroth-Order Proximal Stochastic Gradient Method for Weakly Convex Stochastic Optimization
- Direct Search Based on Probabilistic Descent in Reduced Spaces
Cites Work
- How to generate random matrices from the classical compact groups
- On the optimal order of worst case complexity of direct search
- Faster least squares approximation
- On the global optimization properties of finite-difference local descent algorithms
- Quelques propriétés des opérateurs angle-bornés et n-cycliquement monotones
- Stochastic approximation methods for constrained and unconstrained systems
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Simple statistical gradient-following algorithms for connectionist reinforcement learning
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- A stochastic subspace approach to gradient-free optimization in high dimensions
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Coordinate descent algorithms
- Random gradient-free minimization of convex functions
- Random optimization
- Computational Advertising: Techniques for Targeting Relevant Ads
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Blendenpik: Supercharging LAPACK's Least-Squares Solver
- Introduction to Derivative-Free Optimization
- Multivariate stochastic approximation using a simultaneous perturbation gradient approximation
- Gradient Convergence in Gradient Methods with Errors
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- On the complexity of parallel coordinate descent
- An Implicit Filtering Algorithm for Optimization of Functions with Many Local Minima
- A Stochastic Line Search Method with Expected Complexity Analysis
- Discrete gradient methods for solving variational image regularisation models
- On the Convergence of Block Coordinate Descent Type Methods
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- Direct Search Based on Probabilistic Descent
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Stochastic Estimation of the Maximum of a Regression Function
- A Stochastic Approximation Method
- On a Stochastic Approximation Method
- Randomized numerical linear algebra: Foundations and algorithms