Accelerating the cubic regularization of Newton's method on convex problems
From MaRDI portal
Recommendations
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Cubic regularization of Newton method and its global performance
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- A cubic regularization of Newton's method with finite difference Hessian approximations
- On the convergence of a modified regularized Newton method for convex optimization with singular solutions
Cites work
- Scientific article (no title available); zbMATH DE number 3709086
- Scientific article (no title available); zbMATH DE number 852532
- Scientific article (no title available); zbMATH DE number 3381785
- Scientific article (no title available); zbMATH DE number 3052543
- Cubic regularization of Newton method and its global performance
- Introductory lectures on convex optimization. A basic course.
- Trust Region Methods
Cited in (89 documents)
- Finding extremals of Lagrangian actions
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Newton-type methods for non-convex optimization under inexact Hessian information
- On the quadratic convergence of the cubic regularization method under a local error bound condition
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Unified acceleration of high-order algorithms under general Hölder continuity
- Accelerated Optimization in the PDE Framework: Formulations for the Active Contour Case
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Global sufficient optimality conditions for a special cubic minimization problem
- Superfast second-order methods for unconstrained convex optimization
- A unified adaptive tensor approximation scheme to accelerate composite convex optimization
- Tensor methods for finding approximate stationary points of convex functions
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- An improvement of adaptive cubic regularization method for unconstrained optimization problems
- Global optimality conditions for cubic minimization problem with box or binary constraints
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- A control-theoretic perspective on optimal high-order optimization
- Global optimality conditions for cubic minimization problems with cubic constraints
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Some properties of smooth convex functions and Newton's method
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Accelerated regularized Newton methods for minimizing composite convex functions
- An adaptive high order method for finding third-order critical points of nonconvex optimization
- Local convergence of tensor methods
- Gradient methods for minimizing composite functions
- On the consistent path problem
- A cubic regularization of Newton's method with finite difference Hessian approximations
- On the convergence of a modified regularized Newton method for convex optimization with singular solutions
- Generalized self-concordant functions: a recipe for Newton-type methods
- Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization
- Global complexity bound analysis of the Levenberg-Marquardt method for nonsmooth equations and its application to the nonlinear complementarity problem
- Oracle complexity of second-order methods for smooth convex optimization
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- Cubic regularization of Newton method and its global performance
- Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms
- Linear coupling: an ultimate unification of gradient and mirror descent
- On global minimizers of quadratic functions with cubic regularization
- Smoothness parameter of power of Euclidean norm
- Unveiling the relation between herding and liquidity with trader lead-lag networks
- Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
- Finding geodesics joining given points
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- First-order methods for convex optimization
- Contracting proximal methods for smooth convex optimization
- Inexact basic tensor methods for some classes of convex optimization problems
- Implementable tensor methods in unconstrained convex optimization
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Complexity bounds for second-order optimality in unconstrained optimization
- Cubic regularized Newton method for the saddle point models: a global and local convergence analysis
- A variational formulation of accelerated optimization on Riemannian manifolds
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- An optimal high-order tensor method for convex optimization
- Interior-point methods for nonconvex nonlinear programming: cubic regularization
- A hybrid proximal extragradient self-concordant primal barrier method for monotone variational inequalities
- Scientific article (no title available); zbMATH DE number 7370630
- A search-free \(O(1/k^{3/2})\) homotopy inexact proximal-Newton extragradient algorithm for monotone variational inequalities
- On the redundancy of Hessian nonsingularity for linear convergence rate of the Newton method applied to the minimization of convex functions
- Practical perspectives on symplectic accelerated optimization
- Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- Inexact accelerated high-order proximal-point methods
- A new homotopy proximal variable-metric framework for composite convex minimization
- Higher-order methods for convex-concave min-max optimization and monotone variational inequalities
- Gradient regularization of Newton method with Bregman distances
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- Nesterov's acceleration for approximate Newton
- Inexact tensor methods and their application to stochastic convex optimization
- Accelerated extra-gradient descent: a novel accelerated first-order method
- An accelerated regularized Chebyshev-Halley method for unconstrained optimization
- Set-limited functions and polynomial-time interior-point methods
- Super-Universal Regularized Newton Method
- High-order optimization methods for fully composite problems
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- Adaptive Third-Order Methods for Composite Convex Optimization
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods
- Inexact high-order proximal-point methods with auxiliary search procedure
- Perseus: a simple and optimal high-order method for variational inequalities
- A diagonal finite element-projection-proximal gradient algorithm for elliptic optimal control problem
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
This page was built for publication: Accelerating the cubic regularization of Newton's method on convex problems (MaRDI item Q995787)