A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
From MaRDI portal
Publication: 429459
DOI: 10.1007/s10589-010-9363-1
zbMATH Open: 1268.90048
OpenAlex: W1966397369
MaRDI QID: Q429459
FDO: Q429459
Authors: Sha Lu, Lue Li, Zengxin Wei
Publication date: 19 June 2012
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-010-9363-1
Recommendations
- A new trust region algorithm for nonsmooth convex minimization
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- An effective adaptive trust region algorithm for nonsmooth minimization
- A trust region method for nonsmooth convex optimization
- scientific article; zbMATH DE number 1559140
Keywords: proximal method; trust region method; Moreau-Yosida regularization; nonsmooth convex minimization; cubic overestimation model
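For context, the two constructions named in the keywords have standard textbook forms; the following is a sketch of those standard definitions, and the precise models used in the paper may differ in details. The Moreau-Yosida regularization of a convex function \(f\) with parameter \(\lambda > 0\) is

\[
F_\lambda(x) = \min_{y \in \mathbb{R}^n} \left\{ f(y) + \frac{1}{2\lambda}\|y - x\|^2 \right\},
\]

which is convex and continuously differentiable with gradient \(\nabla F_\lambda(x) = (x - p_\lambda(x))/\lambda\), where \(p_\lambda(x)\) denotes the unique minimizer (the proximal point). A cubic overestimation (adaptive cubic regularization) model for minimizing a smooth function \(F\) at the iterate \(x_k\) has the form

\[
m_k(s) = F(x_k) + \nabla F(x_k)^\top s + \tfrac{1}{2}\, s^\top B_k s + \tfrac{\sigma_k}{3}\,\|s\|^3,
\]

with \(B_k\) a Hessian approximation and \(\sigma_k > 0\) an adaptively updated regularization weight.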
Cites Work
- Convergence analysis of some methods for minimizing a nonsmooth convex function
- A general approach to convergence properties of some methods for nonsmooth convex optimization
- Convergence of some algorithms for convex minimization
- A new method for nonsmooth convex optimization
- A family of variable metric proximal methods
- A unified approach to global convergence of trust region methods for nonsmooth optimization
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- A nonsmooth version of Newton's method
- A descent algorithm for nonsmooth convex optimization
- A Version of the Bundle Idea for Minimizing a Nonsmooth Function: Conceptual Idea, Convergence Analysis, Numerical Results
- Trust Region Methods
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Convergence analysis of a proximal Newton method
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Globally convergent variable metric method for convex nonsmooth unconstrained minimization
- Title not available
- Semismoothness of solutions to generalized equations and the Moreau-Yosida regularization
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Lagrangian-Dual Functions and Moreau–Yosida Regularization
- A new trust region algorithm for nonsmooth convex minimization
- Cubic regularization of Newton method and its global performance
- Accelerating the cubic regularization of Newton's method on convex problems
- An SQP algorithm for extended linear-quadratic problems in stochastic programming
- A trust region algorithm for minimization of locally Lipschitzian functions
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- On Second-Order Properties of the Moreau–Yosida Regularization for Constrained Nonsmooth Convex Programs
Cited In (21)
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Sketched Newton--Raphson
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- Two modified adaptive cubic regularization algorithms by using the nonmonotone Armijo-type line search
- A memory gradient method for non-smooth convex optimization
- A trust region method for nonsmooth convex optimization
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization
- A Trust Region Method for Finding Second-Order Stationarity in Linearly Constrained Nonconvex Optimization
- An ODE-like nonmonotone method for nonsmooth convex optimization
- Multivariate spectral gradient algorithm for nonsmooth convex optimization problems
- Derivative-free separable quadratic modeling and cubic regularization for unconstrained optimization
- A new trust region algorithm for nonsmooth convex minimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
- Conjugate gradient type methods for the nondifferentiable convex minimization
- An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
- Large-scale unconstrained optimization using separable cubic modeling and matrix-free subspace minimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization