High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
DOI: 10.1007/s10107-020-01470-9
zbMATH Open: 1465.90095
arXiv: 1902.10767
OpenAlex: W3000514161
MaRDI QID: Q2020600
Xiaojun Chen, Philippe L. Toint
Publication date: 23 April 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1902.10767
Recommendations
- Sparse optimization for nonconvex group penalized estimation
- Sharp worst-case evaluation complexity bounds for arbitrary-order nonconvex optimization with inexpensive constraints
- Sparsity in higher order methods for unconstrained optimization
- Optimality and complexity for constrained optimization problems with nonconvex regularization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
- Difference-of-Convex Algorithms for a Class of Sparse Group $\ell_0$ Regularized Optimization Problems
- Evaluation complexity of algorithms for nonconvex optimization. Theory, computation and perspectives
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Optimality of orders one to three and beyond: characterization and evaluation complexity in constrained nonconvex optimization
Keywords: nonlinear optimization; complexity theory; group sparsity; isotropic model; non-Lipschitz functions; partially-separable problems
MSC classification: Numerical mathematical programming methods (65K05); Optimality conditions and duality in mathematical programming (90C46); Nonlinear programming (90C30)
Cites Work
- Title not available
- Title not available
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Model Selection and Estimation in Regression with Grouped Variables
- A Modeling Language for Mathematical Programming
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- A group bridge approach for variable selection
- The benefit of group sparsity
- Lower Bound Theory of Nonzero Entries in Solutions of $\ell_2$-$\ell_p$ Minimization
- Trust Region Methods
- Complexity of unconstrained \(L_2 - L_p\) minimization
- Isotropic sparse regularization for spherical harmonic representations of random fields on the sphere
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- Optimality conditions and a smoothing trust region Newton method for nonlipschitz optimization
- Support union recovery in high-dimensional multivariate regression
- Worst-case complexity of smoothing quadratic regularization methods for non-Lipschitzian optimization
- Subspace Methods for Joint Sparse Recovery
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Convergence Properties of Minimization Algorithms for Convex Constraints Using a Structured Trust Region
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Second-order optimality and beyond: characterization and evaluation complexity in convexly constrained nonlinear optimization
- WORST-CASE EVALUATION COMPLEXITY AND OPTIMALITY OF SECOND-ORDER METHODS FOR NONCONVEX SMOOTH OPTIMIZATION
- Group-Sparse Model Selection: Hardness and Relaxations
- Error bounds for compressed sensing algorithms with group sparsity: A unified approach
- Sparse optimization for nonconvex group penalized estimation
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- Modified partial-update Newton-type algorithms for unary optimization
- Partial-Update Newton Methods for Unary, Factorable, and Partially Separable Optimization
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- The Group Lasso for Stable Recovery of Block-Sparse Signal Representations
- Optimization problems involving group sparsity terms
- Complexity of Partially Separable Convexly Constrained Optimization with Non-Lipschitzian Singularities
- Spherical Designs and Nonconvex Minimization for Recovery of Sparse Signals on the Sphere
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
Cited In (7)
- Complexity of finite-sum optimization with nonsmooth composite functions and non-Lipschitz regularization
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- An interior stochastic gradient method for a class of non-Lipschitz optimization problems
- The evaluation complexity of finding high-order minimizers of nonconvex optimization
- Title not available
- An adaptive high order method for finding third-order critical points of nonconvex optimization
- Tensor methods for finding approximate stationary points of convex functions