Asymptotic bootstrap corrections of AIC for linear regression models
MaRDI QID: Q1048800
DOI: 10.1016/j.sigpro.2009.06.010
zbMath: 1177.94071
OpenAlex: W1995714794
Wikidata: Q118322150 (Scholia: Q118322150)
Publication date: 8 January 2010
Published in: Signal Processing
Full work available at URL: https://doi.org/10.1016/j.sigpro.2009.06.010
Mathematics Subject Classification:
- Bootstrap, jackknife and other resampling methods (62F40)
- Estimation and detection in stochastic control theory (93E10)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Statistical aspects of information-theoretic topics (62B10)
Related Items (7)
- The comparison study of the model selection criteria on the Tobit regression model based on the bootstrap sample augmentation mechanisms
- Identification of Directed Influence: Granger Causality, Kullback-Leibler Divergence, and Complexity
- Bootstrap-based model selection criteria for beta regressions
- Improving the Incoherence of a Learned Dictionary via Rank Shrinkage
- Variable selection in linear regression: several approaches based on normalized maximum likelihood
- A cluster tree based model selection approach for logistic regression classifier
- Beta seasonal autoregressive moving average models
Cites Work
- Bayesian estimation of the number of principal components
- Modeling by shortest data description
- Asymptotically efficient selection of the order of the model for estimating parameters of a linear process
- Estimating the dimension of a model
- A large-sample model selection criterion based on Kullback's symmetric divergence
- Statistical predictor identification
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- Filter-order selection in adaptive maximum likelihood estimation
- On information theoretic criteria for determining the number of signals in high resolution array processing
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Regression and time series model selection in small samples
- Bootstrapping State-Space Models: Gaussian Maximum Likelihood Estimation and the Kalman Filter
- Selection of the order of an autoregressive model by Akaike's information criterion
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Minimax description length for signal denoising and optimized representation
- Linear Model Selection by Cross-Validation
- A Small Sample Model Selection Criterion Based on Kullback's Symmetric Divergence
- Some Comments on Cₚ
- On Information and Sufficiency
- An invariant form for the prior probability in estimation problems
- A new look at the statistical model identification