On Nesterov's nonsmooth Chebyshev-Rosenbrock functions
From MaRDI portal
Publication: 654068
DOI: 10.1016/j.na.2011.07.062 · zbMath: 1269.49018 · OpenAlex: W2146018083 · MaRDI QID: Q654068
Mert Gürbüzbalaban, Michael L. Overton
Publication date: 21 December 2011
Published in: Nonlinear Analysis. Theory, Methods & Applications. Series A: Theory and Methods
Full work available at URL: https://doi.org/10.1016/j.na.2011.07.062
Related Items
- On Lipschitz optimization based on gray-box piecewise linearization
- Two globally convergent nonmonotone trust-region methods for unconstrained optimization
- A note about the complexity of minimizing Nesterov's smooth Chebyshev–Rosenbrock function
- Nonsmooth optimization via quasi-Newton methods
- A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization
- An accelerated nonmonotone trust region method with adaptive trust region for unconstrained optimization
- Convergence of non-smooth descent methods using the Kurdyka-Łojasiewicz inequality
- Enumeration of subdifferentials of piecewise linear functions with abs-normal form
- Finite convergence of an active signature method to local minima of piecewise linear functions
- Worst case complexity of direct search
- First- and second-order optimality conditions for piecewise smooth objective functions
- Characterizing and Testing Subdifferential Regularity in Piecewise Smooth Optimization
- An algorithm for nonsmooth optimization by successive piecewise linearization
- A geometric integration approach to nonsmooth, nonconvex optimisation
- Nonsmoothness and a variable metric method
Cites Work
- Nonsmooth optimization via quasi-Newton methods
- Maximum principle in the problem of time optimal response with nonsmooth constraints
- The BFGS method with exact line searches fails for non-convex objective functions
- Methods of descent for nondifferentiable optimization
- Optimization theory and methods. Nonlinear programming
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Variational Analysis
- Convergence Properties of the BFGS Algorithm
- Active Sets, Nonsmoothness, and Sensitivity
- A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
- Approximating Subdifferentials by Random Sampling of Gradients