A Mixed-Integer Fractional Optimization Approach to Best Subset Selection
Publication: 4995087
DOI: 10.1287/ijoc.2020.1031
OpenAlex: W3138025598
MaRDI QID: Q4995087
Andrés Gómez, Oleg A. Prokopyev
Publication date: 23 June 2021
Published in: INFORMS Journal on Computing
Full work available at URL: https://doi.org/10.1287/ijoc.2020.1031
Related Items (6)
- Mixed-integer quadratic programming reformulations of multi-task learning models
- Solving a class of feature selection problems via fractional 0-1 programming
- Fractional 0-1 programs: links between mixed-integer linear and conic quadratic formulations
- An efficient optimization approach for best subset selection in linear regression, with application to model selection and fitting in autoregressive time-series
- Strong formulations for conic quadratic optimization with indicator variables
- An effective procedure for feature subset selection in logistic regression based on information criteria
Uses Software
Cites Work
- Best subset selection via a modern optimization lens
- Mixed integer second-order cone programming formulations for variable selection in linear regression
- Statistics for high-dimensional data. Methods, theory and applications.
- Algorithm for cardinality-constrained quadratic optimization
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- An algorithmic framework for convex mixed integer nonlinear programs
- Polymatroids and mean-risk minimization in discrete optimization
- Estimating the dimension of a model
- Strong formulations for quadratic optimization with M-matrices and indicator variables
- Fractional 0-1 programming: applications and algorithms
- Quadratic cone cutting surfaces for quadratic programs with on-off constraints
- Least angle regression. (With discussion)
- Strong formulations for conic quadratic optimization with indicator variables
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Best subset selection via cross-validation criterion
- Simplex QP-based methods for minimizing a conic quadratic objective over polyhedra
- Sparse learning via Boolean relaxations
- Lifted polymatroid inequalities for mean-risk optimization with indicator variables
- Information criteria and statistical modeling.
- Perspective reformulations of mixed integer nonlinear programs with indicator variables
- OR Forum—An Algorithmic Approach to Linear Regression
- FilMINT: An Outer Approximation-Based Solver for Convex Mixed-Integer Nonlinear Programs
- Successive Quadratic Upper-Bounding for Discrete Mean-Risk Minimization and Network Interdiction
- Regression and time series model selection in small samples
- Combinatorial Optimization with Rational Objective Functions
- Regressions by Leaps and Bounds
- Improved Linear Integer Programming Formulations of Nonlinear Integer Problems
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- Minimization of Akaike's information criterion in linear regression analysis via mixed integer nonlinear program
- Applied Linear Regression
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- On the Convexification of Constrained Quadratic Optimization Problems with Indicator Variables
- Scalable Algorithms for the Sparse Ridge Regression
- On Nonlinear Fractional Programming
- A new look at the statistical model identification