Coordinate Descent for SLOPE
Publication: 6415212
arXiv: 2210.14780
MaRDI QID: Q6415212
FDO: Q6415212
Authors: Johan Larsson, Quentin Klopfenstein, Mathurin Massias, Jonas Wallin
Publication date: 26 October 2022
Abstract: The lasso is the most famous sparse regression and feature selection method. One reason for its popularity is the speed at which the underlying optimization problem can be solved. Sorted L-One Penalized Estimation (SLOPE) is a generalization of the lasso with appealing statistical properties. In spite of this, the method has not yet seen widespread adoption. A major reason is that current software packages that fit SLOPE rely on algorithms that perform poorly in high dimensions. To tackle this issue, we propose a new fast algorithm to solve the SLOPE optimization problem, which combines proximal gradient descent and proximal coordinate descent steps. We provide new results on the directional derivative of the SLOPE penalty and its related SLOPE thresholding operator, as well as convergence guarantees for our proposed solver. In extensive benchmarks on simulated and real data, we show that our method outperforms a long list of competing algorithms.
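The SLOPE penalty referenced in the abstract is the sorted ℓ1 norm, J(β) = Σᵢ λᵢ |β|₍ᵢ₎ with a nonincreasing sequence λ₁ ≥ … ≥ λₚ ≥ 0 applied to the coordinates of β sorted by decreasing magnitude. Both proximal gradient and proximal coordinate descent steps rely on a proximal (thresholding) operator for this penalty. The sketch below is a minimal NumPy implementation of the standard stack-based prox algorithm for the sorted ℓ1 norm (sort, subtract λ, isotonic regression via pool-adjacent-violators, clip at zero); it illustrates the operator only and is not the paper's hybrid solver, and the function name `prox_sorted_l1` is illustrative.

```python
import numpy as np

def prox_sorted_l1(y, lam):
    """Prox of the sorted-l1 norm: argmin_x 0.5*||x - y||^2 + sum_i lam_i |x|_(i).

    Assumes `lam` is nonincreasing and nonnegative (a hypothetical helper,
    not the solver from the paper).
    """
    sign = np.sign(y)
    y_abs = np.abs(y)
    order = np.argsort(y_abs)[::-1]        # indices sorting |y| in decreasing order
    z = y_abs[order] - lam                 # subtract the matched penalty weights

    # Pool-adjacent-violators: project z onto the nonincreasing cone by
    # averaging any adjacent blocks that violate monotonicity.
    vals, cnts = [], []
    for v in z:
        vals.append(v)
        cnts.append(1)
        while len(vals) > 1 and vals[-1] >= vals[-2]:
            v2, c2 = vals.pop(), cnts.pop()
            v1, c1 = vals.pop(), cnts.pop()
            vals.append((v1 * c1 + v2 * c2) / (c1 + c2))
            cnts.append(c1 + c2)

    # Expand the pooled blocks, clip at zero, and undo the sort and signs.
    x = np.zeros(len(z))
    i = 0
    for v, c in zip(vals, cnts):
        x[i:i + c] = max(v, 0.0)
        i += c
    out = np.zeros_like(x)
    out[order] = x
    return sign * out
```

When all λᵢ are equal this reduces to ordinary soft thresholding (the lasso prox); with strictly decreasing λ the pooling step is what makes SLOPE cluster coefficients of similar magnitude to a common value.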
Has companion code repository: https://github.com/jolars/slopecd