New bounds for subset selection from conic relaxations
DOI: 10.1016/j.ejor.2021.07.011 · zbMath: 1490.90236 · OpenAlex: W3179773328 · MaRDI QID: Q2076815
Publication date: 22 February 2022
Published in: European Journal of Operational Research
Full work available at URL: https://doi.org/10.1016/j.ejor.2021.07.011
MSC classification:
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62J05 Linear regression; mixed models
- 90C27 Combinatorial optimization
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Best subset selection via a modern optimization lens
- Mixed integer second-order cone programming formulations for variable selection in linear regression
- On constrained and regularized high-dimensional regression
- Least angle regression. (With discussion)
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Just relax: convex programming methods for identifying sparse signals in noise
- Learning Multiscale Sparse Representations for Image and Video Restoration
- Regressions by Leaps and Bounds
- A Branch and Bound Algorithm for Feature Subset Selection
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Approximate Solutions to Linear Systems
- Harmonic decomposition of audio signals with matching pursuit
- A Direct Formulation for Sparse PCA Using Semidefinite Programming