Global optimization using random embeddings
From MaRDI portal
Publication:6160282
DOI: 10.1007/s10107-022-01871-y
zbMath: 1518.65062
arXiv: 2107.12102
OpenAlex: W3183926510
Wikidata: Q114228462 (Scholia: Q114228462)
MaRDI QID: Q6160282
Estelle M. Massart, Coralia Cartis, Adilet Otemissov
Publication date: 23 June 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2107.12102
Keywords: global optimization, random subspaces, dimensionality reduction techniques, functions with low effective dimensionality, conic integral geometry
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Optimality conditions for problems involving randomness (49K45)
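The keywords above describe the paper's setting: globally optimizing a high-dimensional function whose value effectively depends on only a few directions, by working in a random low-dimensional subspace. A minimal sketch of that idea (not the paper's algorithm; the objective, function names, and parameters here are illustrative):

```python
import numpy as np

def f(x):
    # Toy objective with low effective dimensionality: in ambient
    # dimension D = 100 it depends only on x[0] and x[1];
    # its global minimum value is 0, at x[0] = 1, x[1] = -2.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def random_embedding_search(f, D=100, d=5, iters=2000, seed=0):
    """Minimize f over a random d-dimensional subspace of R^D,
    i.e. minimize y -> f(A @ y), via simple random local search."""
    rng = np.random.default_rng(seed)
    # Random Gaussian embedding A: R^d -> R^D.
    A = rng.standard_normal((D, d)) / np.sqrt(d)
    best_y = np.zeros(d)
    best_val = f(A @ best_y)
    step = 1.0
    for _ in range(iters):
        y = best_y + step * rng.standard_normal(d)
        val = f(A @ y)
        if val < best_val:
            best_y, best_val = y, val
        else:
            step *= 0.999  # slowly shrink the search radius on failure
    return best_val

best = random_embedding_search(f)
print(best)
```

Because the effective dimensionality (2) is below the subspace dimension (5), a random embedding almost surely captures the relevant directions, so the low-dimensional search can approach the global minimum despite never touching the full 100-dimensional space.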
Cites Work
- Parallel coordinate descent methods for big data optimization
- Learning functions of few arbitrary linear parameters in high dimensions
- Gaussian phase transitions and conic intrinsic volumes: steining the Steiner formula
- Learning non-parametric basis independent models from point queries via low-rank methods
- From Steiner formulas for cones to concentration of intrinsic volumes
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Sub-sampled Newton methods
- Random projections for conic programs
- Coordinate descent algorithms
- Random gradient-free minimization of convex functions
- Intrinsic volumes of polyhedral cones: a combinatorial perspective
- Direct search based on probabilistic feasible descent for bound and linearly constrained problems
- The asymptotic expansion of a ratio of gamma functions
- Bound-constrained global optimization of functions with low effective dimensionality using multiple random embeddings
- Lectures on Modern Convex Optimization
- Bayesian Optimization in a Billion Dimensions via Random Embeddings
- Optimization of Convex Functions with Random Pursuit
- Optimization with Sparsity-Inducing Penalties
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Convergence of Trust-Region Methods Based on Probabilistic Models
- Computational Advertising: Techniques for Targeting Relevant Ads
- Introduction to Nonlinear Optimization
- Active Subspaces
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- Stochastic and Integral Geometry
- Minimization by Random Search Techniques
- High-Dimensional Probability
- Stochastic Three Points Method for Unconstrained Smooth Minimization
- A Supervised Learning Approach Involving Active Subspaces for an Efficient Genetic Algorithm in High-Dimensional Optimization Problems
- Proximal Gradient Methods with Adaptive Subspace Sampling
- A dimensionality reduction technique for unconstrained global optimization of functions with low effective dimensionality
- An investigation of Newton-Sketch and subsampled Newton methods
- Living on the edge: phase transitions in convex programs with random data
- Random Projections for Linear Programming
- Direct Search Based on Probabilistic Descent
- Genetic Algorithms and the Optimal Allocation of Trials
- Handbook of metaheuristics
- Benchmarking optimization software with performance profiles
- Scalable subspace methods for derivative-free nonlinear least-squares optimization