A one-bit, comparison-based gradient estimator
From MaRDI portal
Publication:2155805
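The estimator named in the title works from one-bit comparison queries rather than function values. A minimal, generic sketch of the idea (not the paper's specific method, which per the citations below combines comparison oracles with sparse recovery): sample random unit directions, record only the bit "did f increase along this direction?", and average the signed directions. The function, step size, and query budget here are illustrative assumptions.

```python
import numpy as np

def one_bit_gradient_estimate(f, x, num_queries=200, delta=1e-3, seed=0):
    """Estimate the ascent direction of f at x using only one-bit
    comparisons of f(x + delta*u) against f(x)."""
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_queries):
        # Random unit direction.
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        # One bit of information: did f increase along u?
        bit = 1.0 if f(x + delta * u) >= f(x) else -1.0
        g += bit * u
    return g / num_queries

# Example: for f(x) = ||x||^2 the true gradient at x is 2x,
# so the estimate should align with x itself.
x = np.array([1.0, -2.0, 0.5])
g = one_bit_gradient_estimate(lambda z: np.sum(z ** 2), x)
```

In expectation the averaged signed directions point along the normalized gradient (up to a dimension-dependent constant), which is why such estimates recover a usable descent direction from one bit per query.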
Recommendations
- Small errors in random zeroth-order optimization are imaginary
- Minimax efficient finite-difference stochastic gradient estimators using black-box function evaluations
- Black-box reductions for zeroth-order gradient algorithms to achieve lower query complexity
- Zeroth-order regularized optimization (ZORO): approximately sparse gradients and adaptive sampling
Cites work
- scientific article; zbMATH DE number 3910150 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- scientific article; zbMATH DE number 6276119 (no title available)
- A dimensionality reduction technique for unconstrained global optimization of functions with low effective dimensionality
- Active subspaces. Emerging ideas for dimension reduction in parameter studies
- Adaptive stochastic approximation by the simultaneous perturbation method
- Bayesian optimization in a billion dimensions via random embeddings
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Derivative-free optimization methods
- Minimax number of strata for online stratified sampling given noisy samples
- Parallel distributed block coordinate descent methods based on pairwise comparison oracle
- Preference-based reinforcement learning: a formal framework and a policy iteration algorithm
- Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- Zeroth-order regularized optimization (ZORO): approximately sparse gradients and adaptive sampling
Cited in (4)