Minimization Over the Nonconvex Sparsity Constraint Using A Hybrid First-order method
From MaRDI portal
Publication: 6504445
arXiv: 2104.04400 · MaRDI QID: Q6504445 · FDO: Q6504445
Authors: Xiangyu Yang, Hao Wang, Yichen Zhu, Xiao Wang
Abstract: In this paper, we consider the nonconvex optimization problem in which the objective is continuously differentiable and the feasible set is a nonconvex ℓ_p ball with p ∈ (0,1). Studying such problems offers the possibility to bridge the gap, both in theory and practice, between the extreme cases p = 0 and p = 1 in norm-constrained sparse optimization through the range of intermediate values of p. We propose a novel hybrid method within a first-order algorithmic framework for solving such problems by combining the Frank-Wolfe method and the gradient projection method. During the iterations, it solves a Frank-Wolfe subproblem if the current iterate lies in the interior of the ℓ_p ball, and it solves a gradient projection subproblem with a weighted ℓ_1-ball constraint if the current iterate lies on the boundary of the ℓ_p ball. Global convergence is proved, and a worst-case iteration complexity bound on the optimality error is also established. We believe our proposed method is the first practical algorithm for solving ℓ_p-ball constrained nonlinear problems with theoretical guarantees. Numerical experiments demonstrate the practicability and efficiency of the proposed algorithm.
Has companion code repository: https://github.com/optimizater/hybrid-1st-order-algorithm
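The algorithmic outline in the abstract can be illustrated with a short NumPy sketch, written under stated assumptions and distinct from the companion repository above: the Frank-Wolfe linear subproblem over the ℓ_p ball is solved via its vertex solution (the ℓ_p ball with p < 1 is contained in the ℓ_1 ball of radius r^(1/p) and shares its vertices), and the boundary step projects onto a weighted ℓ_1 ball obtained by linearizing the concave map t ↦ t^p. The weight rule, radius, step sizes, backtracking safeguard, and tolerances below are assumptions made for the sketch and are not claimed to match the authors' exact construction.

# Minimal sketch of a hybrid Frank-Wolfe / gradient-projection loop for
#     min f(x)  subject to  sum_i |x_i|^p <= r,  0 < p < 1.
# Illustrative only; see the companion repository for the authors' implementation.
import numpy as np

def lp_ball_lmo(grad, r, p):
    """Linear minimization oracle over {s : sum_i |s_i|^p <= r}.
    The minimizer of <grad, s> is a signed, scaled coordinate vector placed
    at the coordinate with the largest |grad_i|."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -np.sign(grad[i]) * r ** (1.0 / p)
    return s

def project_weighted_l1(x, w, tau, iters=100):
    """Euclidean projection onto {z : sum_i w_i |z_i| <= tau}, w_i > 0,
    via bisection on the soft-threshold multiplier."""
    if np.sum(w * np.abs(x)) <= tau:
        return x.copy()
    lo, hi = 0.0, np.max(np.abs(x) / w)
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        z = np.sign(x) * np.maximum(np.abs(x) - lam * w, 0.0)
        if np.sum(w * np.abs(z)) > tau:
            lo = lam
        else:
            hi = lam
    return np.sign(x) * np.maximum(np.abs(x) - hi * w, 0.0)

def hybrid_first_order(f, grad_f, x0, r, p, step=1e-2, max_iter=500, tol=1e-8):
    """Hybrid loop: Frank-Wolfe step in the interior of the l_p ball,
    gradient projection onto a weighted l_1 inner approximation on the boundary."""
    x = x0.copy()
    for k in range(max_iter):
        g = grad_f(x)
        if np.sum(np.abs(x) ** p) < r - tol:
            # Interior: Frank-Wolfe step with a diminishing step size (assumed rule).
            s = lp_ball_lmo(g, r, p)
            gamma = min(2.0 / (k + 2.0), 1.0)
            x_new = x + gamma * (s - x)
            # The l_p ball is nonconvex, so backtrack gamma until the trial point is feasible.
            while np.sum(np.abs(x_new) ** p) > r and gamma > 1e-12:
                gamma *= 0.5
                x_new = x + gamma * (s - x)
            x = x_new
        else:
            # Boundary: linearize sum_i |x_i|^p at x. By concavity of t -> t^p, the
            # resulting weighted l_1 ball is an inner approximation, so feasibility is kept.
            on = np.abs(x) > tol
            w = np.full_like(x, 1.0 / tol)           # large weight pins (near-)zero coordinates
            w[on] = p * np.abs(x[on]) ** (p - 1.0)
            tau = np.sum(w * np.abs(x))              # radius chosen so the current iterate stays feasible
            x = project_weighted_l1(x - step * g, w, tau)
    return x

A typical use case would pass a smooth objective such as a least-squares loss f(x) = 0.5 * ||A @ x - b||^2 with grad_f(x) = A.T @ (A @ x - b); the bisection-based weighted-ℓ_1 projection is chosen here only for brevity, and a sorting-based exact projection could replace it.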
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Analysis of algorithms and problem complexity (68Q25); Nonconvex programming, global optimization (90C26); Nonsmooth analysis (49J52)