Accelerating the cubic regularization of Newton's method on convex problems

From MaRDI portal
Publication: Q995787

DOI: 10.1007/s10107-006-0089-x · zbMath: 1167.90013 · OpenAlex: W1977109023 · MaRDI QID: Q995787

J. Martínez

Publication date: 10 September 2007

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s10107-006-0089-x




Related Items

An improvement of adaptive cubic regularization method for unconstrained optimization problems
Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
Tensor methods for finding approximate stationary points of convex functions
Inexact basic tensor methods for some classes of convex optimization problems
Local convergence of tensor methods
A Hybrid Proximal Extragradient Self-Concordant Primal Barrier Method for Monotone Variational Inequalities
Cubic regularized Newton method for the saddle point models: a global and local convergence analysis
Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms
A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
Finding geodesics joining given points
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
Gradient methods for minimizing composite functions
Superfast second-order methods for unconstrained convex optimization
Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
Smoothness parameter of power of Euclidean norm
A diagonal finite element-projection-proximal gradient algorithm for elliptic optimal control problem
Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search
Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
Practical perspectives on symplectic accelerated optimization
Regularized Newton Method with Global \({\boldsymbol{\mathcal{O}(1/{k}^2)}}\) Convergence
Global complexity bound analysis of the Levenberg-Marquardt method for nonsmooth equations and its application to the nonlinear complementarity problem
Super-Universal Regularized Newton Method
Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
First-order methods for convex optimization
Adaptive Third-Order Methods for Composite Convex Optimization
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Inexact accelerated high-order proximal-point methods
A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
Global optimality conditions for cubic minimization problems with cubic constraints
Newton-type methods for non-convex optimization under inexact Hessian information
Global sufficient optimality conditions for a special cubic minimization problem
Global optimality conditions for cubic minimization problem with box or binary constraints
Interior-point methods for nonconvex nonlinear programming: cubic regularization
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
Complexity bounds for second-order optimality in unconstrained optimization
Contracting Proximal Methods for Smooth Convex Optimization
Implementable tensor methods in unconstrained convex optimization
Accelerated Optimization in the PDE Framework: Formulations for the Active Contour Case
On the Consistent Path Problem
Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
Some properties of smooth convex functions and Newton's method
Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
Minimizing uniformly convex functions by cubic regularization of Newton method
Unveiling the relation between herding and liquidity with trader lead-lag networks
Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization
Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
An adaptive high order method for finding third-order critical points of nonconvex optimization
On global minimizers of quadratic functions with cubic regularization
Generalized self-concordant functions: a recipe for Newton-type methods
Oracle complexity of second-order methods for smooth convex optimization
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
A control-theoretic perspective on optimal high-order optimization
Finding extremals of Lagrangian actions
Adaptive Hamiltonian Variational Integrators and Applications to Symplectic Accelerated Optimization
On inexact solution of auxiliary problems in tensor methods for convex optimization
Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
A Variational Formulation of Accelerated Optimization on Riemannian Manifolds
Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
Higher-Order Methods for Convex-Concave Min-Max Optimization and Monotone Variational Inequalities
High-Order Optimization Methods for Fully Composite Problems
An Optimal High-Order Tensor Method for Convex Optimization



Cites Work