Accelerating the cubic regularization of Newton's method on convex problems
DOI: 10.1007/s10107-006-0089-x
zbMATH Open: 1167.90013
OpenAlex: W1977109023
MaRDI QID: Q995787
FDO: Q995787
Authors: Yurii Nesterov
Publication date: 10 September 2007
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-006-0089-x
Recommendations
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Cubic regularization of Newton method and its global performance
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- A cubic regularization of Newton's method with finite difference Hessian approximations
- On the convergence of a modified regularized Newton method for convex optimization with singular solutions
Keywords: convex optimization; condition number; Newton's method; unconstrained minimization; worst-case complexity; cubic regularization; global complexity bounds; non-degenerate problems
MSC classification:
- Convex programming (90C25)
- Nonlinear programming (90C30)
- Numerical methods based on nonlinear programming (49M37)
- Newton-type methods (49M15)
- Implicit function theorems; global Newton methods on manifolds (58C15)
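For orientation, a minimal sketch of the technique named in the title, written from the standard formulation of cubic regularization (the step below follows the usual textbook form and is not quoted from this record): for \(f\) convex with \(L\)-Lipschitz-continuous Hessian, one cubic-regularized Newton step minimizes the second-order Taylor model of \(f\) at \(x_k\) plus a cubic penalty,

\[
x_{k+1} \in \operatorname*{arg\,min}_{y \in \mathbb{R}^n} \left\{ \langle \nabla f(x_k),\, y - x_k \rangle + \frac{1}{2} \langle \nabla^2 f(x_k)(y - x_k),\, y - x_k \rangle + \frac{M}{6}\, \| y - x_k \|^3 \right\}, \qquad M \ge L,
\]

and the acceleration studied in this publication improves the global rate of this scheme on convex problems from \(\mathcal{O}(1/k^2)\) to \(\mathcal{O}(1/k^3)\) in function-value accuracy.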
Cited In (88)
- On the redundancy of Hessian nonsingularity for linear convergence rate of the Newton method applied to the minimization of convex functions
- Super-Universal Regularized Newton Method
- Higher-order methods for convex-concave min-max optimization and monotone variational inequalities
- Practical perspectives on symplectic accelerated optimization
- Inexact tensor methods and their application to stochastic convex optimization
- Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search
- A new homotopy proximal variable-metric framework for composite convex minimization
- Set-limited functions and polynomial-time interior-point methods
- A diagonal finite element-projection-proximal gradient algorithm for elliptic optimal control problem
- Adaptive Third-Order Methods for Composite Convex Optimization
- High-order optimization methods for fully composite problems
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- A search-free \(O(1/k^{3/2})\) homotopy inexact proximal-Newton extragradient algorithm for monotone variational inequalities
- Nesterov's acceleration for approximate Newton
- High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods
- Gradient regularization of Newton method with Bregman distances
- Accelerated extra-gradient descent: a novel accelerated first-order method
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- Inexact accelerated high-order proximal-point methods
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- Perseus: a simple and optimal high-order method for variational inequalities
- Inexact high-order proximal-point methods with auxiliary search procedure
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- An accelerated regularized Chebyshev-Halley method for unconstrained optimization
- An optimal high-order tensor method for convex optimization
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
- Linear coupling: an ultimate unification of gradient and mirror descent
- Inexact basic tensor methods for some classes of convex optimization problems
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- Generalized self-concordant functions: a recipe for Newton-type methods
- Contracting proximal methods for smooth convex optimization
- Cubic regularization of Newton method and its global performance
- First-order methods for convex optimization
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Title not available
- Some properties of smooth convex functions and Newton's method
- Local convergence of tensor methods
- A cubic regularization of Newton's method with finite difference Hessian approximations
- Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms
- Accelerated Optimization in the PDE Framework: Formulations for the Active Contour Case
- Accelerated regularized Newton methods for minimizing composite convex functions
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- Newton-type methods for non-convex optimization under inexact Hessian information
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Smoothness parameter of power of Euclidean norm
- Cubic regularized Newton method for the saddle point models: a global and local convergence analysis
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Interior-point methods for nonconvex nonlinear programming: cubic regularization
- Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- A hybrid proximal extragradient self-concordant primal barrier method for monotone variational inequalities
- Superfast second-order methods for unconstrained convex optimization
- Implementable tensor methods in unconstrained convex optimization
- Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives
- An improvement of adaptive cubic regularization method for unconstrained optimization problems
- Global complexity bound analysis of the Levenberg-Marquardt method for nonsmooth equations and its application to the nonlinear complementarity problem
- A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
- On global minimizers of quadratic functions with cubic regularization
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Global optimality conditions for cubic minimization problem with box or binary constraints
- A control-theoretic perspective on optimal high-order optimization
- Global optimality conditions for cubic minimization problems with cubic constraints
- On the consistent path problem
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- Gradient methods for minimizing composite functions
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- A unified adaptive tensor approximation scheme to accelerate composite convex optimization
- Unveiling the relation between herding and liquidity with trader lead-lag networks
- Complexity bounds for second-order optimality in unconstrained optimization
- Global sufficient optimality conditions for a special cubic minimization problem
- An adaptive high order method for finding third-order critical points of nonconvex optimization
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Unified acceleration of high-order algorithms under general Hölder continuity
- Tensor methods for finding approximate stationary points of convex functions
- On the convergence of a modified regularized Newton method for convex optimization with singular solutions
- Adaptive Hamiltonian variational integrators and applications to symplectic accelerated optimization
- A variational formulation of accelerated optimization on Riemannian manifolds
- Finding extremals of Lagrangian actions
- Finding geodesics joining given points