A derivation of the information criteria for selecting autoregressive models
From MaRDI portal
Publication: 3729863
DOI: 10.2307/1427304 · zbMath: 0596.62082 · OpenAlex: W4231716860 · MaRDI QID: Q3729863
Publication date: 1986
Published in: Advances in Applied Probability
Full work available at URL: https://doi.org/10.2307/1427304
Keywords: information; loss function; Akaike information criterion; AIC; Kullback-Leibler; autoregressive process; risk function; order determination; asymptotically unbiased estimator; AIC(alpha)
Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10) · Non-Markovian processes: estimation (62M09)
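The keywords above concern AIC-based order determination for autoregressive processes. As context for the record, here is a minimal sketch (not the paper's derivation) of the standard procedure: fit AR(p) by conditional least squares over a range of orders and select the order minimizing AIC = n log(sigma2_hat) + 2p. The function name and the simulated example are illustrative assumptions, not taken from the publication.

```python
import numpy as np

def select_ar_order_aic(x, max_order):
    """Illustrative sketch: fit AR(p) for p = 1..max_order by least
    squares and return (aic, order, coefficients) for the order that
    minimizes AIC = n * log(sigma2_hat) + 2 * p."""
    n = len(x)
    best = None
    for p in range(1, max_order + 1):
        # Use the same effective sample (t = max_order..n-1) for every
        # order so the AIC values are comparable across orders.
        y = x[max_order:]
        X = np.column_stack([x[max_order - k: n - k] for k in range(1, p + 1)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
        sigma2 = np.mean(resid ** 2)
        aic = len(y) * np.log(sigma2) + 2 * p
        if best is None or aic < best[0]:
            best = (aic, p, coef)
    return best

# Example: simulate a zero-mean AR(2) process; AIC typically recovers
# an order at or slightly above the true order 2.
rng = np.random.default_rng(0)
x = np.zeros(2000)
for t in range(2, 2000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
aic, order, coef = select_ar_order_aic(x, max_order=6)
```

Because the AIC penalty is fixed at 2 per parameter, the selected order has a nonvanishing probability of exceeding the true order, which is one motivation for studying variants such as the AIC(alpha) family named in the keywords.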
Related Items (5)
An automatic portmanteau test for serial correlation ⋮ Model selection for infinite variance time series ⋮ Genetic algorithms for the identification of additive and innovation outliers in time series ⋮ Model selection in orthogonal regression ⋮ AIC, overfitting principles, and the boundedness of moments of inverse matrices for vector autoregressions and related models.