A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
From MaRDI portal
Publication: 429459
DOI: 10.1007/s10589-010-9363-1 · zbMath: 1268.90048 · OpenAlex: W1966397369 · MaRDI QID: Q429459
Publication date: 19 June 2012
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-010-9363-1
Keywords: trust region method; proximal method; Moreau-Yosida regularization; nonsmooth convex minimization; cubic overestimation model
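The keywords name two standard constructions that this record does not define. As background only (the notation \(f\), \(\lambda\), \(g\), \(B\), \(\sigma\) below is generic and not taken from the paper), the Moreau-Yosida regularization of a convex function \(f\) and the cubic overestimation model used in adaptive cubic regularization are usually written as
\[
F_\lambda(x) \;=\; \min_{y} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\}, \qquad \lambda > 0,
\]
\[
m(s) \;=\; f(x) + g(x)^\top s + \tfrac{1}{2}\, s^\top B\, s + \tfrac{\sigma}{3}\,\|s\|^3,
\]
where \(B\) is a Hessian approximation (here, of the smooth Moreau-Yosida envelope) and \(\sigma > 0\) is the adaptively updated regularization weight.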
Related Items (17)
An ODE-like nonmonotone method for nonsmooth convex optimization ⋮ Multivariate spectral gradient algorithm for nonsmooth convex optimization problems ⋮ Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization ⋮ Sketched Newton--Raphson ⋮ A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization ⋮ An efficient conjugate gradient method with strong convergence properties for non-smooth optimization ⋮ A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization ⋮ Conjugate gradient type methods for the nondifferentiable convex minimization ⋮ A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization ⋮ A Trust Region Method for Finding Second-Order Stationarity in Linearly Constrained Nonconvex Optimization ⋮ A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization ⋮ Cubic regularization in symmetric rank-1 quasi-Newton methods ⋮ Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization ⋮ Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization ⋮ Large-scale unconstrained optimization using separable cubic modeling and matrix-free subspace minimization ⋮ Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization ⋮ A memory gradient method for non-smooth convex optimization
Cites Work
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- A new trust region algorithm for nonsmooth convex minimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Convergence analysis of some methods for minimizing a nonsmooth convex function
- A general approach to convergence properties of some methods for nonsmooth convex optimization
- Convergence of some algorithms for convex minimization
- A trust region algorithm for minimization of locally Lipschitzian functions
- A new method for nonsmooth convex optimization
- A family of variable metric proximal methods
- Globally convergent variable metric method for convex nonsmooth unconstrained minimization
- A unified approach to global convergence of trust region methods for nonsmooth optimization
- An SQP algorithm for extended linear-quadratic problems in stochastic programming
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- A nonsmooth version of Newton's method
- Cubic regularization of Newton method and its global performance
- Semismoothness of solutions to generalized equations and the Moreau-Yosida regularization
- A descent algorithm for nonsmooth convex optimization
- Lagrangian-Dual Functions and Moreau–Yosida Regularization
- A Version of the Bundle Idea for Minimizing a Nonsmooth Function: Conceptual Idea, Convergence Analysis, Numerical Results
- Practical Aspects of the Moreau–Yosida Regularization: Theoretical Preliminaries
- Trust Region Methods
- On Second-Order Properties of the Moreau–Yosida Regularization for Constrained Nonsmooth Convex Programs
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Convergence analysis of a proximal Newton method
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization