A CORRECTED AKAIKE INFORMATION CRITERION FOR VECTOR AUTOREGRESSIVE MODEL SELECTION
DOI: 10.1111/J.1467-9892.1993.TB00144.X · zbMATH Open: 0768.62076 · OpenAlex: W2003068901 · MaRDI QID: Q5285834 · FDO: Q5285834
Authors: Clifford Hurvich, Chih-Ling Tsai
Publication date: 29 June 1993
Published in: Journal of Time Series Analysis
Full work available at URL: https://doi.org/10.1111/j.1467-9892.1993.tb00144.x
Recommendations
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
- Order selection criteria for vector autoregressive models
- Regression and time series model selection in small samples
- The model selection criterion AICu
- A derivation of the information criteria for selecting autoregressive models
Keywords: vector autoregressive models; Akaike information criterion; Monte Carlo results; order selection; approximately unbiased estimator; expected Kullback-Leibler information; small-sample criterion
Cites Work
- Time series: theory and methods
- Estimating the dimension of a model
- A new look at the statistical model identification
- Asymptotically efficient selection of the order of the model for estimating parameters of a linear process
- Title not available
- Title not available
- Fitting autoregressive models for prediction
- Title not available
- The exact likelihood function of multivariate autoregressive-moving average models
Cited In (39)
- A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
- Information criteria for Fay-Herriot model selection
- Sieve bootstrap for functional time series
- A derivation of the information criteria for selecting autoregressive models
- Autoregressive model order selection by a finite sample estimator for the Kullback-Leibler discrepancy
- Unifying the derivations for the Akaike and corrected Akaike information criteria
- Bootstrap variants of the Akaike information criterion for mixed model selection
- Estimating the orders of weak multivariate ARMA models
- A note on the corrected Akaike information criterion for threshold autoregressive models
- Multivariate regression model selection from small samples using Kullback's symmetric divergence
- Identifying the number of components in Gaussian mixture models using numerical algebraic geometry
- Order selection criteria for vector autoregressive models
- Title not available
- Skewness-adjusted bootstrap confidence intervals and confidence bands for impulse response functions
- Is first-order vector autoregressive model optimal for fMRI data?
- Modification of the Mallows-Akaike criterion for selecting the order of a regression model
- Insights into the mechanisms of thymus involution and regeneration by modeling the glucocorticoid-induced perturbation of thymocyte populations dynamics
- Modified Schwarz and Hannan-Quinn information criteria for weak VARMA models
- AIC, overfitting principles, and the boundedness of moments of inverse matrices for vector autoregressions and related models
- Bivariate exponentiated discrete Weibull distribution: statistical properties, estimation, simulation and applications
- An improved Akaike information criterion for state-space model selection
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
- A multistage algorithm for best-subset model selection based on the Kullback-Leibler discrepancy
- Sieve bootstrapping the memory parameter in long-range dependent stationary functional time series
- Title not available
- Akaike's information criterion correction for the least-squares autoregressive spectral estimator
- Modeling volatility using state space models with heavy tailed distributions
- Optimal lag-length choice in stable and unstable VAR models under situations of homoscedasticity and ARCH
- Mortality projections for higher educational attainment with semi-parametric accelerated hazard relational models
- A comparison of some criteria for states selection in the latent Markov model for longitudinal data
- Asymptotic theory for information criteria in model selection -- functional approach
- Subspace information criterion for model selection
- Bootstrap prediction bands for forecast paths from vector autoregressive models
- A large-sample model selection criterion based on Kullback's symmetric divergence
- Jointly determining the state dimension and lag order for Markov-switching vector autoregressive models
- Bootstrap Prediction Bands for Functional Time Series
- Identification of directed influence: Granger causality, Kullback-Leibler divergence, and complexity
- Selection of weak VARMA models by modified Akaike's information criteria
- Model selection for independent not identically distributed observations based on Rényi's pseudodistances