Mixed integer second-order cone programming formulations for variable selection in linear regression
Publication: 320071
DOI: 10.1016/j.ejor.2015.06.081
zbMath: 1346.90616
MaRDI QID: Q320071
Ryuhei Miyashiro, Yuichi Takano
Publication date: 6 October 2016
Published in: European Journal of Operational Research
Full work available at URL: https://doi.org/10.1016/j.ejor.2015.06.081
Keywords: integer programming; variable selection; information criterion; multiple linear regression; second-order cone programming
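The keywords above describe the problem this paper treats: selecting a subset of regressors in multiple linear regression by minimizing an information criterion. As a minimal illustration of that selection problem only (a brute-force enumeration, not the paper's mixed integer second-order cone formulation, and with hypothetical data), the following sketch scores every non-empty subset by AIC:

```python
# Illustrative sketch: exhaustive best-subset linear regression scored by
# AIC. The paper instead minimizes such criteria exactly via mixed integer
# second-order cone programming; this brute force only shows the objective.
from itertools import combinations
import math
import numpy as np

def best_subset_aic(X, y):
    """Return (best column-index tuple, its AIC) over all non-empty subsets."""
    n = len(y)
    best_cols, best_aic = None, math.inf
    for k in range(1, X.shape[1] + 1):
        for cols in combinations(range(X.shape[1]), k):
            A = np.column_stack([np.ones(n), X[:, cols]])  # intercept + subset
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # OLS fit
            rss = float(np.sum((y - A @ beta) ** 2))
            # AIC for Gaussian errors: n*log(RSS/n) + 2 * (#parameters);
            # parameters = k slopes + intercept + error variance.
            aic = n * math.log(rss / n) + 2 * (k + 2)
            if aic < best_aic:
                best_cols, best_aic = cols, aic
    return best_cols, best_aic

# Hypothetical data: y depends only on columns 0 and 2 of X.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.1 * rng.standard_normal(40)
cols, aic = best_subset_aic(X, y)
```

Enumeration is exponential in the number of candidate variables, which is precisely why exact formulations such as the MISOCP approach of this paper are of interest.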
Related Items
- Minimization of Akaike's information criterion in linear regression analysis via mixed integer nonlinear program
- Feature subset selection for logistic regression via mixed integer optimization
- Discrete optimization methods to fit piecewise affine models to data points
- Predicting online invitation responses with a competing risk model using privacy-friendly social event data
- Locating hyperplanes to fitting set of points: a general framework
- DC formulations and algorithms for sparse optimization problems
- Alternating direction method of multipliers for truss topology optimization with limited number of nodes: a cardinality-constrained second-order cone programming approach
- Feature scaling via second-order cone programming
- Mixed integer quadratic optimization formulations for eliminating multicollinearity based on variance inflation factor
- Mixed Integer Nonlinear Program for Minimization of Akaike's Information Criterion
- A new approach to select the best subset of predictors in linear regression modelling: bi-objective mixed integer linear programming
Uses Software
Cites Work
- Using simulated annealing to optimize the feature selection problem in marketing applications
- Choosing the best set of variables in regression analysis using integer programming
- Algorithm for cardinality-constrained quadratic optimization
- Multi-step methods for choosing the best set of variables in regression analysis
- Efficient algorithms for computing the best subset regression models for large-scale problems
- Applications of second-order cone programming
- Selection of relevant features and examples in machine learning
- Wrappers for feature subset selection
- Estimating the dimension of a model
- Robust classification and regression using support vector machines
- An efficient support vector machine learning method with second-order cone programming for large-scale problems
- Regressions by Leaps and Bounds
- A Biometrics Invited Paper. The Analysis and Selection of Variables in Linear Regression
- Further analysis of the data by Akaike's information criterion and the finite corrections
- A New Formula for Predicting the Shrinkage of the Coefficient of Multiple Correlation
- Model Selection and Multimodel Inference
- Regression Model Selection—A Residual Likelihood Approach
- DOI: 10.1162/153244303322753616
- Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
- Computational Methods of Feature Selection
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- A new look at the statistical model identification