The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization
From MaRDI portal
Publication: Q5347971
zbMATH Open: 1368.94035 · arXiv: 1508.01922 · MaRDI QID: Q5347971
Authors: Rahul Mazumder, Peter Radchenko
Publication date: 25 August 2017
Abstract: We propose a novel high-dimensional linear regression estimator: the Discrete Dantzig Selector, which minimizes the number of nonzero regression coefficients subject to a budget on the maximal absolute correlation between the features and residuals. Motivated by the significant advances in integer optimization over the past 10-15 years, we present a Mixed Integer Linear Optimization (MILO) approach to obtain certifiably optimal global solutions to this nonconvex optimization problem. The current state of algorithmics in integer optimization makes our proposal substantially more computationally attractive than the least squares subset selection framework based on integer quadratic optimization, recently proposed in [8], and the continuous nonconvex quadratic optimization framework of [33]. We propose new discrete first-order methods, which, when paired with state-of-the-art MILO solvers, lead to good solutions for the Discrete Dantzig Selector problem for a given computational budget. We illustrate that our integrated approach provides globally optimal solutions in significantly shorter computation times, when compared to off-the-shelf MILO solvers. We demonstrate both theoretically and empirically that, in a wide range of regimes, the statistical properties of the Discrete Dantzig Selector are superior to those of popular \(\ell_1\)-based approaches. We illustrate that our approach can handle problem instances with p = 10,000 features with certifiable optimality, making it a highly scalable combinatorial variable selection approach in sparse linear modeling.
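The estimator described above admits a natural big-M MILO formulation: minimize \(\sum_j z_j\) over \((\beta, z)\) with binary indicators \(z_j\), subject to \(\|X^\top(y - X\beta)\|_\infty \le \delta\) and \(-M z_j \le \beta_j \le M z_j\). The following is a minimal sketch of that formulation on a small synthetic instance using SciPy's generic MILP interface; it is not the authors' implementation, and the instance size, \(\delta\), and big-M value \(M\) are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Hypothetical toy instance (not from the paper): sparse truth with 2 nonzeros.
rng = np.random.default_rng(0)
n, p = 20, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, -3.0, 0.0, 0.0])
y = X @ beta_true

delta = 1e-4   # budget on the max absolute feature-residual correlation
M = 10.0       # assumed big-M bound on |beta_j|

XtX, Xty = X.T @ X, X.T @ y

# Decision variables stacked as [beta (p, continuous), z (p, binary)];
# the objective counts the nonzero indicators z_j.
c = np.concatenate([np.zeros(p), np.ones(p)])

# |X^T (y - X beta)|_inf <= delta  <=>  Xty - delta <= XtX @ beta <= Xty + delta
corr = LinearConstraint(np.hstack([XtX, np.zeros((p, p))]),
                        Xty - delta, Xty + delta)
# Big-M link: beta_j - M z_j <= 0 and beta_j + M z_j >= 0
link_up = LinearConstraint(np.hstack([np.eye(p), -M * np.eye(p)]), -np.inf, 0.0)
link_lo = LinearConstraint(np.hstack([np.eye(p), M * np.eye(p)]), 0.0, np.inf)

res = milp(c,
           constraints=[corr, link_up, link_lo],
           integrality=np.concatenate([np.zeros(p), np.ones(p)]),
           bounds=Bounds(np.concatenate([-M * np.ones(p), np.zeros(p)]),
                         np.concatenate([M * np.ones(p), np.ones(p)])))

beta_hat = res.x[:p]
print("optimal number of nonzeros:", int(round(res.fun)))
```

Because the noiseless instance has full column rank and a tiny \(\delta\), the correlation constraint pins \(\beta\) near the sparse truth, so the certified optimum here is two nonzero coefficients. The discrete first-order methods proposed in the paper would supply a warm start to such a solver rather than replace it.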
Full work available at URL: https://arxiv.org/abs/1508.01922
Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Mixed integer programming (90C11)
Cited In (12)
- Fast best subset selection: coordinate descent and local combinatorial optimization algorithms
- Using \(\ell_1\)-relaxation and integer programming to obtain dual bounds for sparse PCA
- The trimmed Lasso: sparse recovery guarantees and practical optimization by the generalized soft-min penalty
- Cardinality minimization, constraints, and regularization: a survey
- Scalable algorithms for the sparse ridge regression
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
- Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
- Mixed integer quadratic optimization formulations for eliminating multicollinearity based on variance inflation factor
- Graph structured sparse subset selection
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Robust subset selection