Accelerated regularized Newton methods for minimizing composite convex functions
From MaRDI portal
Publication:4646444
Recommendations
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Regularized Newton methods for convex minimization problems with singular solutions
- Accelerating the cubic regularization of Newton's method on convex problems
- Gradient methods for minimizing composite functions
Cites work
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- ARC\(_q\): a new adaptive regularization by cubics
- Accelerating the cubic regularization of Newton's method on convex problems
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Cubic regularization of Newton method and its global performance
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Gradient methods for minimizing composite functions
- On High-order Model Regularization for Constrained Optimization
- On the use of the energy norm in trust-region and adaptive cubic regularization subproblems
- Regularized Newton methods for minimizing functions with Hölder continuous hessians
- Smooth minimization of non-smooth functions
- The use of quadratic regularization with a cubic descent condition for unconstrained optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
Cited in (23)
- Super-Universal Regularized Newton Method
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- Contracting proximal methods for smooth convex optimization
- Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
- A Regularized Newton Method for \(\boldsymbol{\ell}_q\)-Norm Composite Optimization Problems
- Local convergence of tensor methods
- Set-limited functions and polynomial-time interior-point methods
- Adaptive Third-Order Methods for Composite Convex Optimization
- scientific article (zbMATH DE number 7449564; no title available)
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Smoothness parameter of power of Euclidean norm
- An adaptive regularized proximal Newton-type method for composite optimization over the Stiefel manifold
- Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives
- A control-theoretic perspective on optimal high-order optimization
- Near-optimal hyperfast second-order method for convex optimization
- A unified adaptive tensor approximation scheme to accelerate composite convex optimization
- Perseus: a simple and optimal high-order method for variational inequalities
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- Unified acceleration of high-order algorithms under general Hölder continuity
- Tensor methods for finding approximate stationary points of convex functions
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- An accelerated regularized Chebyshev-Halley method for unconstrained optimization
- On the complexity of solving feasibility problems with regularized models