Pages that link to "Item:Q4629334"
From MaRDI portal
The following pages link to Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy (Q4629334):
Displaying 31 items.
- On large-scale unconstrained optimization and arbitrary regularization (Q2070329)
- A control-theoretic perspective on optimal high-order optimization (Q2089793)
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization (Q2089862)
- Implementable tensor methods in unconstrained convex optimization (Q2227532)
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems (Q2419539)
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization (Q2419564)
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization (Q2696932)
- The impact of noise on evaluation complexity: the deterministic trust-region case (Q2696963)
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities (Q4634141)
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions (Q4646444)
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives (Q4971023)
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity (Q5003214)
- An Optimal High-Order Tensor Method for Convex Optimization (Q5026443)
- On high-order model regularization for multiobjective optimization (Q5038176)
- On the complexity of solving feasibility problems with regularized models (Q5038424)
- Tensor methods for finding approximate stationary points of convex functions (Q5038435)
- Inexact basic tensor methods for some classes of convex optimization problems (Q5043845)
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization (Q5131958)
- On the Complexity of an Inexact Restoration Method for Constrained Optimization (Q5210514)
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints (Q5217594)
- Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact (Q5235099)
- On inexact solution of auxiliary problems in tensor methods for convex optimization (Q5859013)
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization (Q6109884)
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization (Q6114954)
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence (Q6116237)
- Adaptive Third-Order Methods for Composite Convex Optimization (Q6171322)
- Inexact tensor methods and their application to stochastic convex optimization (Q6585820)
- Complexity of a projected Newton-CG method for optimization with bounds (Q6608031)
- Higher-order Newton methods with polynomial work per iteration (Q6608710)
- Hessian barrier algorithms for non-convex conic optimization (Q6665383)
- Perseus: a simple and optimal high-order method for variational inequalities (Q6665392)