Sparse regression at scale: branch-and-bound rooted in first-order optimization
Publication:2097642
DOI: 10.1007/s10107-021-01712-4
zbMath: 1506.90168
arXiv: 2004.06152
OpenAlex: W3205533488
MaRDI QID: Q2097642
Rahul Mazumder, Hussein Hazimeh, Ali Saab
Publication date: 14 November 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2004.06152
branch and bound; mixed integer programming; sparsity; \(\ell_0\) regularization; first-order methods; large scale computation
Mixed integer programming (90C11); Polyhedral combinatorics, branch-and-bound, branch-and-cut (90C57); Quadratic programming (90C20)
Related Items
L0BnB, Global optimization for sparse solution of least squares problems, Grouped variable selection with discrete optimization: computational and statistical perspectives, A new perspective on low-rank optimization, Learning sparse nonlinear dynamics via mixed-integer optimization, A graph-based decomposition method for convex quadratic optimization with indicators, Supermodularity and valid inequalities for quadratic optimization with indicators, Subset Selection and the Cone of Factor-Width-k Matrices, Special issue: Global solution of integer, stochastic and nonconvex optimization problems
Uses Software
Cites Work
- Best subset selection via a modern optimization lens
- Gradient methods for minimizing composite functions
- Iteration complexity analysis of block coordinate descent methods
- Iterative hard thresholding for compressed sensing
- A strong conic quadratic reformulation for machine-job assignment with controllable processing times
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- On the solution of traveling salesman problems
- On implementing a primal-dual interior-point method for conic quadratic optimization
- Extended formulations in mixed integer conic quadratic programming
- A polyhedral branch-and-cut approach to global optimization
- Branching rules revisited
- Sparse classification: a scalable discrete optimization perspective
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Sparse regression: scalable algorithms and empirical performance
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Discussion of "Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons"
- Sparse learning via Boolean relaxations
- Perspective cuts for a class of convex 0-1 mixed integer programs
- Perspective reformulations of mixed integer nonlinear programs with indicator variables
- Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms
- A Lifted Linear Programming Branch-and-Bound Algorithm for Mixed-Integer Conic Quadratic Programs
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Extended Formulations in Mixed-Integer Convex Programming
- An outer-approximation algorithm for a class of mixed-integer nonlinear programs
- Sparse Approximate Solutions to Linear Systems
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Necessary and Sufficient Conditions for Sparsity Pattern Recovery
- Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
- Scalable Algorithms for the Sparse Ridge Regression
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Mixed-integer nonlinear optimization
- A tree-search algorithm for mixed integer programming problems
- Convergence of a block coordinate descent method for nondifferentiable minimization