Equations of states in singular statistical estimation

From MaRDI portal
Publication:1784533

DOI: 10.1016/J.NEUNET.2009.08.002
zbMATH Open: 1396.68106
DBLP: journals/nn/Watanabe10
arXiv: 0712.0653
OpenAlex: W2055505622
Wikidata: Q51798993
Scholia: Q51798993
MaRDI QID: Q1784533
FDO: Q1784533


Authors: Sumio Watanabe


Publication date: 27 September 2018

Published in: Neural Networks

Abstract: Learning machines with hierarchical structures or hidden variables are singular statistical models because they are nonidentifiable and their Fisher information matrices are singular. In singular statistical models, the Bayes a posteriori distribution does not converge to a normal distribution, nor does the maximum likelihood estimator satisfy asymptotic normality. This is the main reason why it has been difficult to predict their generalization performance from trained states. In this paper, we study four errors, (1) Bayes generalization error, (2) Bayes training error, (3) Gibbs generalization error, and (4) Gibbs training error, and prove that there are mathematical relations among these errors. The formulas proved in this paper are equations of states in statistical estimation because they hold for any true distribution, any parametric model, and any a priori distribution. We also show that the Bayes and Gibbs generalization errors can be estimated from the Bayes and Gibbs training errors, and propose widely applicable information criteria that can be applied to both regular and singular statistical models.
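The "widely applicable information criteria" mentioned at the end of the abstract estimate the Bayes generalization error from quantities computable on the training sample alone: the Bayes training loss plus a functional-variance penalty. Below is a minimal sketch of that computation from posterior samples; the function name, input layout, and the toy usage are illustrative assumptions, not code from the paper.

import numpy as np
from scipy.special import logsumexp

def waic(log_lik):
    """Estimate the Bayes generalization loss via WAIC.

    log_lik : array of shape (S, n) holding log p(x_i | w_s) for S posterior
        draws w_s and n training points x_i (illustrative input; in practice
        these would come from an MCMC or other posterior-sampling run).
    """
    S, n = log_lik.shape
    # Bayes training loss: minus the mean log posterior-predictive density,
    # T_n = -(1/n) * sum_i log( (1/S) * sum_s p(x_i | w_s) )
    T_n = -np.mean(logsumexp(log_lik, axis=0) - np.log(S))
    # Functional variance: per-point posterior variance of the log-likelihood,
    # V_n = sum_i Var_s[ log p(x_i | w_s) ]
    V_n = np.sum(np.var(log_lik, axis=0, ddof=1))
    # WAIC on the per-sample loss scale; multiply by 2n for the common
    # deviance-scale convention.
    return T_n + V_n / n

# Toy usage (illustrative): a unit-variance normal model with stand-in
# posterior draws of its mean.
rng = np.random.default_rng(0)
x = rng.normal(size=50)                          # "training" sample
mu_draws = rng.normal(x.mean(), 0.2, size=1000)  # stand-in posterior draws
log_lik = -0.5 * (x[None, :] - mu_draws[:, None]) ** 2 - 0.5 * np.log(2 * np.pi)
print(waic(log_lik))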


Full work available at URL: https://arxiv.org/abs/0712.0653









Cited In (13)





