Low information omnibus (LIO) priors for Dirichlet process mixture models
DOI: 10.1214/18-BA1119 · zbMATH Open: 1421.62078 · OpenAlex: W2893855365 · Wikidata: Q129201740 · MaRDI QID: Q2316979
Authors: Yushu Shi, Michael J. Martens, A. Banerjee, Purushottam W. Laud
Publication date: 7 August 2019
Published in: Bayesian Analysis
Full work available at URL: https://projecteuclid.org/euclid.ba/1560240023
Recommendations
- Variational inference for Dirichlet process mixtures
- On selecting a prior for the precision parameter of Dirichlet process mixture models
- Marginal Likelihood and Bayes Factors for Dirichlet Process Mixture Models
- Subjective priors for the Dirichlet process
- Selecting the precision parameter prior in Dirichlet process mixture models
- Mean field inference for the Dirichlet process mixture model
Keywords: density estimation; survival analysis; Dirichlet process mixture model; Bayesian nonparametric methods; low-information prior
MSC: Density estimation (62G07); Classification and discrimination; cluster analysis (statistical aspects) (62H30); Reliability and life testing (62N05)
Cites Work
- Bayesian data analysis.
- Bayesian Density Estimation and Inference Using Mixtures
- An ANOVA Model for Dependent Random Measures
- Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems
- On posterior consistency in nonparametric regression problems
- A Bayesian analysis of some nonparametric problems
- Hierarchical Mixture Modeling With Normalized Inverse-Gaussian Priors
- Nonparametric Estimation of a Survivorship Function with Doubly Censored Data
- Title not available
- A weakly informative default prior distribution for logistic and other regression models
- Distributional results for means of normalized random measures with independent increments
- Investigating nonparametric priors with Gibbs structure
- Posterior Analysis for Normalized Random Measures with Independent Increments
- On a class of Bayesian nonparametric estimates: I. Density estimates
- Posterior consistency of Dirichlet mixtures in density estimation
- Title not available
- New approaches to Bayesian consistency
- Kullback Leibler property of kernel mixture priors in Bayesian density estimation
- A blocked Gibbs sampler for NGG-mixture models via a priori truncation
- Controlling the Reinforcement in Bayesian Non-Parametric Mixture Models
- Bayesian density estimation and model selection using nonparametric hierarchical mixtures
- The \(L_{1}\)-consistency of Dirichlet mixtures in multivariate Bayesian density estimation
- Modeling Unobserved Sources of Heterogeneity in Animal Abundance Using a Dirichlet Process Prior
- Nonparametric Bayesian survival analysis using mixtures of Weibull distributions
- Low information omnibus (LIO) priors for Dirichlet process mixture models
- Robustifying Bayesian nonparametric mixtures for count data
Cited In (8)
- A Robustified Posterior for Bayesian Inference on a Large Number of Parallel Effects
- Dirichlet process mixtures under affine transformations of the data
- Nonparametric failure time: time-to-event machine learning with heteroskedastic Bayesian additive regression trees and low information omnibus Dirichlet process mixtures
- Low information omnibus (LIO) priors for Dirichlet process mixture models
- A review of uncertainty quantification for density estimation
- A Dirichlet process mixture model for non-ignorable dropout
- On the inferential implications of decreasing weight structures in mixture models
- A dependent Dirichlet process model for survival data with competing risks