Iterative Bias Correction of the Cross-Validation Criterion
From MaRDI portal
Publication:2911707
DOI: 10.1111/j.1467-9469.2011.00754.x ⋮ zbMath: 1246.62093 ⋮ OpenAlex: W1813075113 ⋮ MaRDI QID: Q2911707
Hironori Fujisawa, Hirokazu Yanagihara
Publication date: 1 September 2012
Published in: Scandinavian Journal of Statistics
Full work available at URL: https://doi.org/10.1111/j.1467-9469.2011.00754.x
GIC ⋮ asymptotic expansion ⋮ model selection ⋮ bias correction ⋮ EIC ⋮ bootstrap iteration ⋮ leave-\(k\)-out cross-validation
Asymptotic properties of nonparametric inference (62G20) ⋮ Nonparametric estimation (62G05) ⋮ Nonparametric statistical resampling methods (62G09)
Related Items
Asymptotic biases of information and cross-validation criteria under canonical parametrization ⋮ Estimating the Kullback–Leibler risk based on multifold cross-validation
Cites Work
- Bias correction of cross-validation criterion based on Kullback-Leibler information under a general condition
- A family of estimators for multivariate kurtosis in a nonnormal linear regression model
- Robust parameter estimation with a small bias against heavy contamination
- A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
- Bootstrapping log likelihood and EIC, an extension of AIC
- Bias correction of AIC in logistic regression models
- Asymptotic theory for information criteria in model selection -- functional approach
- Selection of smoothing parameters in \(B\)-spline nonparametric regression models using information criteria
- A large-sample model selection criterion based on Kullback's symmetric divergence
- Model selection via multifold cross validation
- Quadratic distances on probabilities: A unified foundation
- Information criteria and statistical modeling.
- Corrected version of \(AIC\) for selecting multivariate normal linear regression models in a general nonnormal case
- Robust estimation in the normal mixture model
- A Bias Correction for Cross-validation Bandwidth Selection when a Kernel Estimate is Based on Dependent Data
- Single-index model selections
- Second-order bias-corrected AIC in multivariate normal linear models under non-normality
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- Cross-Validation of Regression Models
- Smoothing parameter selection in quasi-likelihood models
- Model Selection in High Dimensions: A Quadratic-Risk-Based Approach
- An extended quasi-likelihood function
- On bootstrap resampling and iteration
- Regression and time series model selection in small samples
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Robust and efficient estimation by minimising a density power divergence
- A crossvalidatory AIC for hard wavelet thresholding in spatially adaptive function estimation
- Generalised information criteria in model selection
- Linear Model Selection by Cross-Validation
- On Information and Sufficiency
- Maximum Likelihood Estimation of Misspecified Models
- The bootstrap and Edgeworth expansion
- A new look at the statistical model identification