Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
Publication: 4634141
DOI: 10.1137/18M1166511
zbMath: 1411.90318
arXiv: 1704.06919
OpenAlex: W2925648101
Wikidata: Q128145683 (Scholia: Q128145683)
MaRDI QID: Q4634141
Xiaojun Chen, Hong Wang, Philippe L. Toint
Publication date: 7 May 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1704.06919
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Optimality conditions and duality in mathematical programming (90C46)
Related Items
- Tensor methods for finding approximate stationary points of convex functions
- An interior stochastic gradient method for a class of non-Lipschitz optimization problems
- The evaluation complexity of finding high-order minimizers of nonconvex optimization
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- On High-Order Multilevel Optimization Strategies
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- An adaptive high order method for finding third-order critical points of nonconvex optimization
Uses Software
Cites Work
- A Modeling Language for Mathematical Programming
- A smoothing SQP framework for a class of composite \(L_q\) minimization over polyhedron
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Complexity of unconstrained \(L_2 - L_p\) minimization
- Cubic regularization of Newton method and its global performance
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- Optimality Conditions and a Smoothing Trust Region Newton Method for NonLipschitz Optimization
- Lower Bound Theory of Nonzero Entries in Solutions of \(\ell_2\)-\(\ell_p\) Minimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Trust Region Methods
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Convergence Properties of Minimization Algorithms for Convex Constraints Using a Structured Trust Region
- Partial-Update Newton Methods for Unary, Factorable, and Partially Separable Optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians