A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
From MaRDI portal
Recommendations
- A new trust region algorithm for nonsmooth convex minimization
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- An effective adaptive trust region algorithm for nonsmooth minimization
- A trust region method for nonsmooth convex optimization
- scientific article; zbMATH DE number 1559140
Cites work
- scientific article; zbMATH DE number 439380
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Version of the Bundle Idea for Minimizing a Nonsmooth Function: Conceptual Idea, Convergence Analysis, Numerical Results
- A descent algorithm for nonsmooth convex optimization
- A family of variable metric proximal methods
- A general approach to convergence properties of some methods for nonsmooth convex optimization
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- A new method for nonsmooth convex optimization
- A new trust region algorithm for nonsmooth convex minimization
- A nonsmooth version of Newton's method
- A trust region algorithm for minimization of locally Lipschitzian functions
- A unified approach to global convergence of trust region methods for nonsmooth optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- An SQP algorithm for extended linear-quadratic problems in stochastic programming
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Convergence analysis of a proximal Newton method
- Convergence analysis of some methods for minimizing a nonsmooth convex function
- Convergence of some algorithms for convex minimization
- Cubic regularization of Newton method and its global performance
- Globally convergent variable metric method for convex nonsmooth unconstrained minimization
- Lagrangian-Dual Functions and Moreau–Yosida Regularization
- On Second-Order Properties of the Moreau–Yosida Regularization for Constrained Nonsmooth Convex Programs
- Practical Aspects of the Moreau–Yosida Regularization: Theoretical Preliminaries
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- Semismoothness of solutions to generalized equations and the Moreau-Yosida regularization
- Trust Region Methods
Cited in (24)
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search
- A trust region method for nonsmooth convex optimization
- A memory gradient method for non-smooth convex optimization
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- Sketched Newton-Raphson
- Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization
- A cubic regularization method for solving nonsmooth equations
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- An ODE-like nonmonotone method for nonsmooth convex optimization
- An effective adaptive trust region algorithm for nonsmooth minimization
- Multivariate spectral gradient algorithm for nonsmooth convex optimization problems
- Derivative-free separable quadratic modeling and cubic regularization for unconstrained optimization
- A new trust region algorithm for nonsmooth convex minimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
- Conjugate gradient type methods for the nondifferentiable convex minimization
- A trust region method for finding second-order stationarity in linearly constrained nonconvex optimization
- An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
- Large-scale unconstrained optimization using separable cubic modeling and matrix-free subspace minimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization