Asymptotic accuracy of Bayesian estimation for a single latent variable
From MaRDI portal
Publication:1669142
DOI: 10.1016/J.NEUNET.2015.04.014
zbMATH Open: 1394.68324
DBLP: journals/nn/Yamazaki15a
arXiv: 1408.5661
OpenAlex: W2224858857
Wikidata: Q40892228
MaRDI QID: Q1669142
Authors: Keisuke Yamazaki
Publication date: 30 August 2018
Published in: Neural Networks
Abstract: In data science and machine learning, hierarchical parametric models, such as mixture models, are often used. They contain two kinds of variables: observable variables, which represent the parts of the data that can be directly measured, and latent variables, which represent the underlying processes that generate the data. Although research on the estimation accuracy for observable variables has increased, the theoretical analysis of latent-variable estimation has not been thoroughly investigated. In a previous study, we determined the accuracy of the Bayes estimation of the joint probability of the latent variables in a dataset, and we proved that the Bayes method is asymptotically more accurate than the maximum-likelihood method. However, the accuracy of the Bayes estimation for a single latent variable remains unknown. In the present paper, we derive the asymptotic expansions of the error functions, which are defined by the Kullback-Leibler divergence, for two types of single-variable estimations when statistical regularity is satisfied. Our results indicate that the accuracies of the Bayes and maximum-likelihood methods are asymptotically equivalent, and they clarify that the Bayes method is advantageous only for multivariable estimations.
Full work available at URL: https://arxiv.org/abs/1408.5661
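The abstract compares Bayes and maximum-likelihood estimation of a single latent variable, with error measured by Kullback-Leibler divergence. As a minimal illustration (not from the paper; the mixture parameters, EM fit, and function names below are all hypothetical), the following sketch estimates the latent component indicator of each point in a two-component Gaussian mixture by plugging ML-fitted parameters into the single-variable posterior P(z | x):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-component 1-D Gaussian mixture with known unit variances.
true_w, true_mu = 0.3, np.array([-2.0, 2.0])

def sample(n):
    """Draw n points; z[i] is True when point i came from component 0."""
    z = rng.random(n) < true_w
    x = np.where(z, true_mu[0], true_mu[1]) + rng.standard_normal(n)
    return x, z

def ml_fit(x):
    """Crude maximum-likelihood fit via a few EM steps (variances fixed at 1)."""
    w, mu = 0.5, np.array([-1.0, 1.0])
    for _ in range(50):
        r0 = w * np.exp(-0.5 * (x - mu[0]) ** 2)
        r1 = (1.0 - w) * np.exp(-0.5 * (x - mu[1]) ** 2)
        g = r0 / (r0 + r1)                 # responsibility of component 0
        w = g.mean()
        mu = np.array([(g * x).sum() / g.sum(),
                       ((1.0 - g) * x).sum() / (1.0 - g).sum()])
    return w, mu

def latent_posterior(xi, w, mu):
    """Plug-in estimate of P(z = 0 | x_i) under the fitted parameters."""
    p0 = w * np.exp(-0.5 * (xi - mu[0]) ** 2)
    p1 = (1.0 - w) * np.exp(-0.5 * (xi - mu[1]) ** 2)
    return p0 / (p0 + p1)

x, z = sample(500)
w_hat, mu_hat = ml_fit(x)
p = latent_posterior(x, w_hat, mu_hat)
acc = ((p > 0.5) == z).mean()   # fraction of correctly recovered indicators
```

A fully Bayesian variant would instead average `latent_posterior` over the parameter posterior; the paper's result is that for a single latent variable this averaging yields no asymptotic advantage over the plug-in ML estimate above.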
Recommendations
- Bayesian estimation of multidimensional latent variables and its asymptotic accuracy
- Accuracy of latent-variable estimation in Bayesian semi-supervised learning
- Asymptotic accuracy of distribution-based estimation of latent variables
- Asymptotic accuracy of Bayes estimation for latent variables with redundancy
- Effects of additional data on Bayesian clustering
Cites Work
- Estimating the dimension of a model
- Asymptotic Statistics
- Maximum Likelihood Estimation of Misspecified Models
- A new look at the statistical model identification
- Convergence rates of posterior distributions.
- Convergence of estimates under dimensionality restrictions
- Title not available
- Bayesian predictive information criterion for the evaluation of hierarchical Bayesian and empirical Bayes models
- Singularities in mixture models and upper bounds of stochastic complexity.
- Convergence of latent mixing measures in finite and infinite mixture models
- Resolution of singularities of an algebraic variety over a field of characteristic zero. I
- Algebraic analysis for nonidentifiable learning machines
- Algebraic Geometry and Statistical Learning Theory
- An asymptotic behaviour of the marginal likelihood for general Markov models
- Note on the Consistency of the Maximum Likelihood Estimate
- Information-theoretic asymptotics of Bayes methods
- Selecting models from data: artificial intelligence and statistics IV. Selected papers presented at the fourth international workshop on artificial intelligence and statistics held in January 1993
Cited In (5)
- Asymptotic accuracy of Bayes estimation for latent variables with redundancy
- Accuracy of latent-variable estimation in Bayesian semi-supervised learning
- Asymptotic accuracy of distribution-based estimation of latent variables
- Bayesian estimation of multidimensional latent variables and its asymptotic accuracy
- Effects of additional data on Bayesian clustering
This page was built for publication: Asymptotic accuracy of Bayesian estimation for a single latent variable