Pages that link to "Item:Q995787"
From MaRDI portal
The following pages link to Accelerating the cubic regularization of Newton's method on convex problems (Q995787):
Displaying 50 items.
- Gradient methods for minimizing composite functions (Q359630)
- Global complexity bound analysis of the Levenberg-Marquardt method for nonsmooth equations and its application to the nonlinear complementarity problem (Q415364)
- A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization (Q429459)
- Interior-point methods for nonconvex nonlinear programming: cubic regularization (Q457205)
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results (Q535013)
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity (Q652287)
- Complexity bounds for second-order optimality in unconstrained optimization (Q657654)
- Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization (Q746819)
- Global optimality conditions for cubic minimization problems with cubic constraints (Q907371)
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization (Q1675558)
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods (Q1942265)
- Global sufficient optimality conditions for a special cubic minimization problem (Q1955308)
- Global optimality conditions for cubic minimization problem with box or binary constraints (Q1959231)
- Minimizing uniformly convex functions by cubic regularization of Newton method (Q2032037)
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization (Q2041515)
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences (Q2044479)
- An adaptive high order method for finding third-order critical points of nonconvex optimization (Q2079692)
- A control-theoretic perspective on optimal high-order optimization (Q2089793)
- Finding extremals of Lagrangian actions (Q2094445)
- Local convergence of tensor methods (Q2133417)
- Cubic regularized Newton method for the saddle point models: a global and local convergence analysis (Q2148117)
- Finding geodesics joining given points (Q2165015)
- Smoothness parameter of power of Euclidean norm (Q2178876)
- Newton-type methods for non-convex optimization under inexact Hessian information (Q2205970)
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria (Q2220664)
- Implementable tensor methods in unconstrained convex optimization (Q2227532)
- Some properties of smooth convex functions and Newton's method (Q2243857)
- On global minimizers of quadratic functions with cubic regularization (Q2329649)
- Generalized self-concordant functions: a recipe for Newton-type methods (Q2330645)
- Oracle complexity of second-order methods for smooth convex optimization (Q2330652)
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization (Q2515043)
- Superfast second-order methods for unconstrained convex optimization (Q2664892)
- Inexact accelerated high-order proximal-point methods (Q2689812)
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization (Q2885470)
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization (Q2921184)
- A Hybrid Proximal Extragradient Self-Concordant Primal Barrier Method for Monotone Variational Inequalities (Q3449570)
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent (Q4638051)
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions (Q4646444)
- Unveiling the relation between herding and liquidity with trader lead-lag networks (Q4957237)
- (Q4969259)
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives (Q4971023)
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method (Q4993286)
- (Q4999082)
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity (Q5003214)
- Adaptive Hamiltonian Variational Integrators and Applications to Symplectic Accelerated Optimization (Q5010240)
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure (Q5013580)
- An Optimal High-Order Tensor Method for Convex Optimization (Q5026443)
- An improvement of adaptive cubic regularization method for unconstrained optimization problems (Q5031220)
- Tensor methods for finding approximate stationary points of convex functions (Q5038435)
- Inexact basic tensor methods for some classes of convex optimization problems (Q5043845)