Multivariate regression model selection from small samples using Kullback's symmetric divergence
DOI: 10.1016/j.sigpro.2005.10.009
zbMATH Open: 1172.94483
OpenAlex: W1990749325
Wikidata: Q118322232 (Scholia: Q118322232)
MaRDI QID: Q1031241 (FDO: Q1031241)
Publication date: 29 October 2009
Published in: Signal Processing
Full work available at URL: https://doi.org/10.1016/j.sigpro.2005.10.009
Recommendations
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
- A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
- Model selection criteria based on Kullback information measures for nonlinear regression
- The Kullback information criterion for mixture regression models
- A CORRECTED AKAIKE INFORMATION CRITERION FOR VECTOR AUTOREGRESSIVE MODEL SELECTION
MSC classification:
- Statistical aspects of information-theoretic topics (62B10)
- Linear regression; mixed models (62J05)
- Estimation in multivariate analysis (62H12)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Cited In (4)
- A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
- A Small Sample Model Selection Criterion Based on Kullback's Symmetric Divergence
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
- Kullback-Leibler divergence measure for multivariate skew-normal distributions