A multistage algorithm for best-subset model selection based on the Kullback-Leibler discrepancy
DOI: 10.1007/s00180-015-0584-8
zbMath: 1342.65077
OpenAlex: W2069101646
MaRDI QID: Q736651
Joseph E. Cavanaugh, Tao Zhang
Publication date: 4 August 2016
Published in: Computational Statistics
Full work available at URL: https://doi.org/10.1007/s00180-015-0584-8
Computational methods for problems pertaining to statistics (62-08)
Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
Related Items (2)
- Extending AIC to best subset regression
- Best-subset model selection based on multitudinal assessments of likelihood improvements
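For context on the titular discrepancy: the Kullback-Leibler discrepancy is the quantity estimated by the Akaike information criterion (AIC) and its corrected variants, which recur throughout the cited works below. As a minimal sketch of this standard background (general AIC theory, not the paper's multistage algorithm itself), for a candidate model with maximized likelihood $\hat{L}$ and $k$ estimated parameters,

\[ \mathrm{AIC} = -2 \log \hat{L} + 2k, \]

an asymptotically unbiased estimator, up to an additive constant, of twice the expected Kullback-Leibler discrepancy between the fitted model and the generating model; best-subset procedures compare such criterion values across candidate predictor subsets.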
Cites Work
- An improved Akaike information criterion for state-space model selection
- The multivariate inclusion-exclusion formula and order statistics from dependent variates
- Estimating the dimension of a model
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Regression and time series model selection in small samples
- An optimal selection of regression variables
- Further analysis of the data by Akaike's information criterion and the finite corrections
- The Covariance Inflation Criterion for Adaptive Model Selection
- Modified AIC and Cp in multivariate linear regression
- Statistical Models
- Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
- Model Selection for Multivariate Regression in Small Samples
- A corrected Akaike information criterion for vector autoregressive model selection
- Computational Efficiency in the Selection of Regression Variables
- On Information and Sufficiency
- A new look at the statistical model identification