Principled selection of hyperparameters in the latent Dirichlet allocation model
From MaRDI portal
Publication: 4558486
zbMATH Open: 1471.62283 · MaRDI QID: Q4558486 · FDO: Q4558486
Authors: Clint P. George, Hani Doss
Publication date: 22 November 2018
Full work available at URL: http://jmlr.csail.mit.edu/papers/v18/15-595.html
Recommendations
Markov chain Monte Carlo, model selection, latent Dirichlet allocation, empirical Bayes inference, topic modelling
Computational methods in Markov chains (60J22); Bayesian inference (62F15); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Graphical models, exponential families, and variational inference
- Title not available
- Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes
- Title not available
- Markov chain Monte Carlo: can we trust the third significant figure?
- Monte Carlo sampling methods using Markov chains and their applications
- Calibration and empirical Bayes variable selection
- Latent Dirichlet allocation (DOI: 10.1162/jmlr.2003.3.4-5.993)
- Fixed-Width Output Analysis for Markov Chain Monte Carlo
- Hierarchical Dirichlet Processes
- Annealing Markov Chain Monte Carlo with Applications to Ancestral Inference
- On the Bernstein-von Mises theorem with infinite-dimensional parameters
- The Effect of Improper Priors on Gibbs Sampling in Hierarchical Linear Mixed Models
- Convergence of the Monte Carlo expectation maximization for curved exponential families.
- An introduction to MCMC for machine learning
- Gibbs sampling, exponential families and orthogonal polynomials
- An MCMC approach to empirical Bayes inference and Bayesian sensitivity analysis via empirical processes
- Distributed algorithms for topic models
Cited In (13)
- The exact asymptotic form of Bayesian generalization error in latent Dirichlet allocation
- Selection of Proposal Distributions for Multiple Importance Sampling
- Inference for the Number of Topics in the Latent Dirichlet Allocation Model via Bayesian Mixture Modeling
- Robust initialization for learning latent Dirichlet allocation
- Dense distributions from sparse samples: improved Gibbs sampling parameter estimators for LDA
- Online but accurate inference for latent variable models with local Gibbs sampling
- Distributed algorithms for topic models
- DOLDA: a regularized supervised topic model for high-dimensional multi-class regression
- Scalable empirical Bayes inference and Bayesian sensitivity analysis
- Latent Dirichlet allocation (DOI: 10.1162/jmlr.2003.3.4-5.993)
- Scalable Hyperparameter Selection for Latent Dirichlet Allocation
- Learning in volatile environments with the Bayes factor surprise
- A correlated topic model of science