A cubic regularization of Newton's method with finite difference Hessian approximations
DOI: 10.1007/s11075-021-01200-y
zbMATH Open: 1492.65166
OpenAlex: W3205980973
Wikidata: Q114224288 (Scholia: Q114224288)
MaRDI QID: Q2138398
FDO: Q2138398
Authors: Yanyan Li
Publication date: 11 May 2022
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-021-01200-y
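For orientation, the sketch below illustrates the general technique named in the title: a cubic-regularized Newton iteration in which the Hessian is replaced by a forward-finite-difference approximation built from gradient evaluations. It follows the standard cubic model of Nesterov and Polyak's "Cubic regularization of Newton method and its global performance" (listed under Cites Work below), not necessarily the exact algorithm of this publication; all function names are illustrative, the regularization parameter `sigma` is held fixed rather than updated adaptively, and the degenerate "hard case" of the subproblem is ignored.

```python
import numpy as np

def fd_hessian(grad, x, h=1e-6):
    """Forward-difference Hessian approximation from gradient calls:
    column i is (grad(x + h*e_i) - grad(x)) / h, then symmetrized."""
    n = x.size
    g0 = grad(x)
    H = np.empty((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        H[:, i] = (grad(x + e) - g0) / h
    return 0.5 * (H + H.T)

def cubic_step(g, H, sigma):
    """Minimizer of the cubic model  g^T s + 0.5 s^T H s + (sigma/3)||s||^3,
    via the characterization s = -(H + lam*I)^{-1} g with lam = sigma*||s||.
    Bisection on lam; the degenerate 'hard case' is ignored for simplicity."""
    n = g.size
    eigmin = np.linalg.eigvalsh(H)[0]
    lo = max(0.0, -eigmin) + 1e-12          # keep H + lam*I positive definite
    snorm = lambda lam: np.linalg.norm(np.linalg.solve(H + lam * np.eye(n), -g))
    hi = lo + 1.0
    while sigma * snorm(hi) > hi:           # grow hi until the root is bracketed
        hi *= 2.0
    for _ in range(100):                    # bisect on sigma*||s(lam)|| = lam
        mid = 0.5 * (lo + hi)
        if sigma * snorm(mid) > mid:
            lo = mid
        else:
            hi = mid
    return np.linalg.solve(H + 0.5 * (lo + hi) * np.eye(n), -g)

def cubic_newton_fd(grad, x0, sigma=1.0, tol=1e-8, max_iter=200):
    """Cubic-regularized Newton iteration with a finite-difference Hessian.
    sigma is fixed here; adaptive variants update it at every iteration."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + cubic_step(g, fd_hessian(grad, x), sigma)
    return x

# Example: the Rosenbrock function, a standard nonconvex test problem.
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(cubic_newton_fd(grad, np.array([-1.2, 1.0])))  # converges to ~ [1. 1.]
```

The subproblem solver uses the well-known secular-equation characterization of the global cubic-model minimizer; production implementations typically replace the plain bisection with a safeguarded Newton iteration on lam and, as in the adaptive cubic regularisation (ARC) literature cited below, accept or reject steps while updating sigma.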
Recommendations
- Cubic regularization of Newton method and its global performance
- Accelerating the cubic regularization of Newton's method on convex problems
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- Newton-type methods for non-convex optimization under inexact Hessian information
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
Cites Work
- Testing Unconstrained Optimization Software
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- On the worst-case evaluation complexity of non-monotone line search algorithms
- Title not available
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Cubic regularization of Newton method and its global performance
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- The use of quadratic regularization with a cubic descent condition for unconstrained optimization
- An improvement of adaptive cubic regularization method for unconstrained optimization problems
- Fast derivatives of likelihood functionals for ODE based models using adjoint-state method
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- A line-search algorithm inspired by the adaptive cubic regularization framework and complexity analysis
- A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- Lower bounds for finding stationary points I
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
Cited In (9)
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Cubic overestimation and secant updating for unconstrained optimization of \(C^{2,1}\) functions
- Gradient descent finds the cubic-regularized nonconvex Newton step
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
- Quadratic regularization methods with finite-difference gradient approximations
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Stochastic variance-reduced cubic regularization methods