Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
From MaRDI portal
Recommendations
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
Cites work
- scientific article; zbMATH DE number 3928227 (no title available)
- Accelerating the cubic regularization of Newton's method on convex problems
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Affine conjugate adaptive Newton methods for nonlinear elastomechanics
- Cubic regularization of Newton method and its global performance
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- Trust Region Methods
Cited in (21 documents)
- Parameter-free accelerated gradient descent for nonconvex minimization
- Linear-time convexity test for low-order piecewise polynomials
- Oracle complexity of second-order methods for smooth convex optimization
- On the worst-case complexity of nonlinear stepsize control algorithms for convex unconstrained optimization
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Adaptive Third-Order Methods for Composite Convex Optimization
- Newton-type methods for non-convex optimization under inexact Hessian information
- Minimizing uniformly convex functions by cubic regularization of Newton method
- On the evaluation complexity of cubic regularization methods for potentially rank-deficient nonlinear least-squares problems and its relevance to constrained nonlinear optimization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Algebraic rules for quadratic regularization of Newton's method
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Implementable tensor methods in unconstrained convex optimization
- Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
- A note on the worst-case complexity of nonlinear stepsize control methods for convex smooth unconstrained optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- A unified adaptive tensor approximation scheme to accelerate composite convex optimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
- An accelerated regularized Chebyshev-Halley method for unconstrained optimization
MaRDI item: Q2885470