Pages that link to "Item:Q2902870"
The following pages link to "On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization" (Q2902870):
Displaying 27 items.
- Complexity bounds for second-order optimality in unconstrained optimization (Q657654) (← links)
- Worst case complexity of direct search (Q743632) (← links)
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models (Q1646566) (← links)
- A derivative-free trust-region algorithm for composite nonsmooth optimization (Q2013620) (← links)
- Derivative-free restrictively preconditioned conjugate gradient path method without line search technique for solving linear equality constrained optimization (Q2013811) (← links)
- Worst-case complexity bounds of directional direct-search methods for multiobjective optimization (Q2026717) (← links)
- A cubic regularization of Newton's method with finite difference Hessian approximations (Q2138398) (← links)
- Projected adaptive cubic regularization algorithm with derivative-free filter technique for box constrained optimization (Q2244360) (← links)
- An interior affine scaling cubic regularization algorithm for derivative-free optimization subject to bound constraints (Q2357423) (← links)
- On the complexity of finding first-order critical points in constrained nonlinear optimization (Q2452373) (← links)
- Inexact accelerated high-order proximal-point methods (Q2689812) (← links)
- Evaluation Complexity for Nonlinear Constrained Optimization Using Unscaled KKT Conditions and High-Order Models (Q2802144) (← links)
- A second-order globally convergent direct-search method and its worst-case complexity (Q2810113) (← links)
- On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization (Q2815548) (← links)
- Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the NonSmooth Case (Q2826817) (← links)
- Cubic overestimation and secant updating for unconstrained optimization of \(C^{2,1}\) functions (Q2926071) (← links)
- Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy (Q5034938) (← links)
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization (Q5085238) (← links)
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization (Q5131958) (← links)
- Derivative-free optimization methods (Q5230522) (← links)
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization (Q5244400) (← links)
- A note about the complexity of minimizing Nesterov's smooth Chebyshev–Rosenbrock function (Q5299905) (← links)
- Worst case complexity of direct search under convexity (Q5962720) (← links)
- Direct Search Based on Probabilistic Descent in Reduced Spaces (Q6071887) (← links)
- Quadratic regularization methods with finite-difference gradient approximations (Q6175465) (← links)
- Derivative-free separable quadratic modeling and cubic regularization for unconstrained optimization (Q6489314) (← links)
- Worst case complexity bounds for linesearch-type derivative-free algorithms (Q6636792) (← links)