A CORRECTED AKAIKE INFORMATION CRITERION FOR VECTOR AUTOREGRESSIVE MODEL SELECTION
From MaRDI portal
Publication:5285834
Recommendations
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
- Order selection criteria for vector autoregressive models
- Regression and time series model selection in small samples
- The model selection criterion AICu
- A derivation of the information criteria for selecting autoregressive models
Cites work
- scientific article; zbMATH DE number 3854249
- scientific article; zbMATH DE number 4062374
- scientific article; zbMATH DE number 4088698
- A new look at the statistical model identification
- Asymptotically efficient selection of the order of the model for estimating parameters of a linear process
- Estimating the dimension of a model
- Fitting autoregressive models for prediction
- The exact likelihood function of multivariate autoregressive-moving average models
- Time series: theory and methods
Cited in (39)
- Model selection for independent not identically distributed observations based on Rényi's pseudodistances
- A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
- Information criteria for Fay-Herriot model selection
- Sieve bootstrap for functional time series
- A derivation of the information criteria for selecting autoregressive models
- Autoregressive model order selection by a finite sample estimator for the Kullback-Leibler discrepancy
- Unifying the derivations for the Akaike and corrected Akaike information criteria
- Bootstrap variants of the Akaike information criterion for mixed model selection
- Estimating the orders of weak multivariate ARMA models
- Multivariate regression model selection from small samples using Kullback's symmetric divergence
- A note on the corrected Akaike information criterion for threshold autoregressive models
- Order selection criteria for vector autoregressive models
- Identifying the number of components in Gaussian mixture models using numerical algebraic geometry
- Skewness-adjusted bootstrap confidence intervals and confidence bands for impulse response functions
- scientific article; zbMATH DE number 1911039
- Is first-order vector autoregressive model optimal for fMRI data?
- Modification of the Mallows-Akaike criterion for selecting the order of a regression model
- Insights into the mechanisms of thymus involution and regeneration by modeling the glucocorticoid-induced perturbation of thymocyte populations dynamics
- Modified Schwarz and Hannan-Quinn information criteria for weak VARMA models
- AIC, overfitting principles, and the boundedness of moments of inverse matrices for vector autoregressions and related models
- Bivariate exponentiated discrete Weibull distribution: statistical properties, estimation, simulation and applications
- An improved Akaike information criterion for state-space model selection
- A multistage algorithm for best-subset model selection based on the Kullback-Leibler discrepancy
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
- Sieve bootstrapping the memory parameter in long-range dependent stationary functional time series
- scientific article; zbMATH DE number 7647984
- Akaike's information criterion correction for the least-squares autoregressive spectral estimator
- Modeling volatility using state space models with heavy tailed distributions
- Optimal lag-length choice in stable and unstable VAR models under situations of homoscedasticity and ARCH
- A comparison of some criteria for states selection in the latent Markov model for longitudinal data
- Mortality projections for higher educational attainment with semi-parametric accelerated hazard relational models
- Asymptotic theory for information criteria in model selection -- functional approach
- Subspace information criterion for model selection
- Bootstrap prediction bands for forecast paths from vector autoregressive models
- A large-sample model selection criterion based on Kullback's symmetric divergence
- Jointly determining the state dimension and lag order for Markov-switching vector autoregressive models
- Bootstrap Prediction Bands for Functional Time Series
- Identification of directed influence: Granger causality, Kullback-Leibler divergence, and complexity
- Selection of weak VARMA models by modified Akaike's information criteria