A Bayesian analysis of the minimum AIC procedure

From MaRDI portal
Publication: 1143080

DOI: 10.1007/BF02480194
zbMath: 0441.62007
OpenAlex: W4230211598
Wikidata: Q61440891
Scholia: Q61440891
MaRDI QID: Q1143080

Hirotugu Akaike

Publication date: 1978

Published in: Annals of the Institute of Statistical Mathematics

Full work available at URL: https://doi.org/10.1007/bf02480194
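For orientation, a minimal sketch (not taken from the paper) of the minimum AIC procedure the title refers to: among candidate models, choose the one minimizing AIC = 2k − 2 ln L̂, where k is the number of fitted parameters and L̂ the maximized likelihood. For Gaussian least-squares fits this reduces, up to an additive constant, to n·ln(RSS/n) + 2k. The polynomial-regression setup and all variable names below are illustrative assumptions.

```python
# Illustrative sketch of minimum-AIC model selection among polynomial
# regression fits.  For Gaussian least squares, AIC = n*ln(RSS/n) + 2k
# up to an additive constant, where k counts fitted coefficients.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = np.linspace(-1.0, 1.0, n)
# Synthetic data whose true mean is a degree-2 polynomial.
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.3, size=n)

def aic_for_degree(d):
    """Fit a degree-d polynomial by least squares; return its AIC."""
    coeffs = np.polyfit(x, y, d)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    k = d + 1  # number of estimated coefficients
    return n * np.log(rss / n) + 2 * k

degrees = range(6)
best = min(degrees, key=aic_for_degree)
print("degree chosen by minimum AIC:", best)
```

The minimum AIC rule trades fit (the RSS term) against complexity (the 2k penalty); Akaike's 1978 paper gives this procedure a Bayesian interpretation.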



Related Items

A procedure for estimating the number of clusters in logistic regression clustering
Some new model selection criteria in simple regression
Model selection for forecasting
Minimal penalties for Gaussian model selection
A note on choosing the number of factors
Generalized maximum entropy based identification of graphical ARMA models
The weighted likelihood
Motor Unit Number Estimation: A Bayesian Approach
A generalization of the logistic linear model
Asymptotically optimal selection of a piecewise polynomial estimator of a regression function
Dimension reduction transfer function model
FACAIC: Model selection algorithm for the orthogonal factor model using AIC and CAIC
Generalized information criterion
Statistical Problem Classes and Their Links to Information Theory
Approximate Bayes model selection procedures for Gibbs-Markov random fields
Bayesian inference: Weibull Poisson model for censored data using the expectation–maximization algorithm and its application to bladder cancer data
The model selection criterion AICu
Testing temporal constancy of the spectral structure of a time series
Information theory as a unifying statistical approach for use in marketing research
Bootstrap-based testing inference in beta regressions
Forecasting using predictive likelihood model averaging
A procedure for variable selection in double generalized linear models
Model selection in multivariate adaptive regression splines (MARS) using information complexity as the fitness function
Forecasting the proportion of stored energy using the unit Burr XII quantile autoregressive moving average model
Markov-modulated Hawkes process with stepwise decay
Does a Bayesian approach generate robust forecasts? Evidence from applications in portfolio investment decisions
Likelihood of a model and information criteria
A comparison of the information and posterior probability criteria for model selection
Model selection criteria in beta regression with varying dispersion
Combining the data from two normal populations to estimate the mean of one when their means difference is bounded
Time-varying coefficient models with ARMA–GARCH structures for longitudinal data analysis
An improved Akaike information criterion for state-space model selection
Model selection criteria for the leads-and-lags cointegrating regression
Statistical inference for multiple choice tests
A deviance-based criterion for model selection in GLM
Estimating the number of signals of the damped exponential models
Model selection for estimating the non zero components of a Gaussian vector
Maximum likelihood characterization of distributions
Optimality of AIC in inference about Brownian motion
Using a Mixture Model for Multiple Imputation in the Presence of Outliers: The ‘Healthy for Life’ Project
The estimation of frequency in the multichannel sinusoidal model
Gaussian model selection with an unknown variance
A consistent model selection procedure for Markov random fields based on penalized pseudolikelihood
ARMA process for speckled data
The Doubly Adaptive LASSO for Vector Autoregressive Models
HODE: Hidden One-Dependence Estimator
Determining the order of the functional autoregressive model
Testing for serial correlation in multivariate regression models
Asymptotic mean efficiency of a selection of regression variables
Model selection for factor analysis: Some new criteria and performance comparisons
On the selection of subset bilinear time series models: a genetic algorithm approach
A new method to discriminate between enzyme-kinetic models



Cites Work