Accelerating the cubic regularization of Newton's method on convex problems
From MaRDI portal
Publication:995787
DOI: 10.1007/s10107-006-0089-x · zbMath: 1167.90013 · OpenAlex: W1977109023 · MaRDI QID: Q995787
Publication date: 10 September 2007
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-006-0089-x
Keywords: convex optimization; Newton's method; condition number; unconstrained minimization; worst-case complexity; cubic regularization; global complexity bounds; non-degenerate problems
MSC classifications: Convex programming (90C25); Nonlinear programming (90C30); Newton-type methods (49M15); Numerical methods based on nonlinear programming (49M37); Implicit function theorems; global Newton methods on manifolds (58C15)
Related Items
An improvement of adaptive cubic regularization method for unconstrained optimization problems
Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
Tensor methods for finding approximate stationary points of convex functions
Inexact basic tensor methods for some classes of convex optimization problems
Local convergence of tensor methods
A Hybrid Proximal Extragradient Self-Concordant Primal Barrier Method for Monotone Variational Inequalities
Cubic regularized Newton method for the saddle point models: a global and local convergence analysis
Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms
A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
Finding geodesics joining given points
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
Gradient methods for minimizing composite functions
Superfast second-order methods for unconstrained convex optimization
Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
Smoothness parameter of power of Euclidean norm
A diagonal finite element-projection-proximal gradient algorithm for elliptic optimal control problem
Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search
Factor-√2 acceleration of accelerated gradient methods
Practical perspectives on symplectic accelerated optimization
Regularized Newton Method with Global O(1/k²) Convergence
Global complexity bound analysis of the Levenberg-Marquardt method for nonsmooth equations and its application to the nonlinear complementarity problem
Super-Universal Regularized Newton Method
Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
First-order methods for convex optimization
Adaptive Third-Order Methods for Composite Convex Optimization
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Inexact accelerated high-order proximal-point methods
A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
Global optimality conditions for cubic minimization problems with cubic constraints
Newton-type methods for non-convex optimization under inexact Hessian information
Global sufficient optimality conditions for a special cubic minimization problem
Global optimality conditions for cubic minimization problem with box or binary constraints
Interior-point methods for nonconvex nonlinear programming: cubic regularization
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
Complexity bounds for second-order optimality in unconstrained optimization
Contracting Proximal Methods for Smooth Convex Optimization
Implementable tensor methods in unconstrained convex optimization
Accelerated Optimization in the PDE Framework: Formulations for the Active Contour Case
On the Consistent Path Problem
Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
Some properties of smooth convex functions and Newton's method
Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
Minimizing uniformly convex functions by cubic regularization of Newton method
Unveiling the relation between herding and liquidity with trader lead-lag networks
Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization
Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
An adaptive high order method for finding third-order critical points of nonconvex optimization
On global minimizers of quadratic functions with cubic regularization
Generalized self-concordant functions: a recipe for Newton-type methods
Oracle complexity of second-order methods for smooth convex optimization
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
A control-theoretic perspective on optimal high-order optimization
Finding extremals of Lagrangian actions
Adaptive Hamiltonian Variational Integrators and Applications to Symplectic Accelerated Optimization
On inexact solution of auxiliary problems in tensor methods for convex optimization
Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
A Variational Formulation of Accelerated Optimization on Riemannian Manifolds
Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
Higher-Order Methods for Convex-Concave Min-Max Optimization and Monotone Variational Inequalities
High-Order Optimization Methods for Fully Composite Problems
An Optimal High-Order Tensor Method for Convex Optimization
Cites Work