Convergence rates of latent topic models under relaxed identifiability conditions
Publication: 1711601
DOI: 10.1214/18-EJS1516
zbMath: 1433.62092
arXiv: 1710.11070
OpenAlex: W2964350306
Wikidata: Q128626613 (Scholia: Q128626613)
MaRDI QID: Q1711601
Publication date: 18 January 2019
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1710.11070
Cites Work
- Global identifiability of linear structural equation models
- Borrowing strength in hierarchical Bayes: posterior concentration of the Dirichlet base measure
- Sharper bounds for Gaussian and empirical processes
- Optimal rate of convergence for finite mixture models
- Convergence of latent mixing measures in finite and infinite mixture models
- Posterior contraction of the population polytope in finite admixture models
- A spectral algorithm for latent Dirichlet allocation
- Tensor decompositions for learning latent variable models
- Learning Mixtures of Gaussians in High Dimensions
- Learning mixtures of spherical Gaussians
- Condition Estimates
- Algebraic Geometry and Statistical Learning Theory
- Asymptotic Statistics
- Algebraic Problems in Structural Equation Modeling
- Latent Dirichlet allocation (DOI: 10.1162/jmlr.2003.3.4-5.993)
- Singularity Structures and Impacts on Parameter Estimation in Finite Mixtures of Distributions
- Identifiability of directed Gaussian graphical models with one latent source