Hierarchical Dirichlet scaling process

From MaRDI portal
Publication: 2014581

DOI: 10.1007/S10994-016-5621-5
zbMATH Open: 1419.62062
DBLP: journals/ml/KimO17
arXiv: 1404.1282
OpenAlex: W3122242054
Wikidata: Q59521370
Scholia: Q59521370
MaRDI QID: Q2014581
FDO: Q2014581


Authors: Dongwoo Kim, Alice Oh


Publication date: 25 August 2017

Published in: Machine Learning

Abstract: We present the hierarchical Dirichlet scaling process (HDSP), a Bayesian nonparametric mixed-membership model. The HDSP generalizes the hierarchical Dirichlet process (HDP) to model the correlation structure between corpus metadata and mixture components. We construct the HDSP from the normalized gamma representation of the Dirichlet process; this construction allows us to incorporate a scaling function that controls the membership probabilities of the mixture components. We develop two scaling methods to demonstrate that different modeling assumptions can be expressed in the HDSP. We also derive the corresponding approximate posterior inference algorithms using variational Bayes. Through experiments on datasets of newswire, medical journal articles, conference proceedings, and product reviews, we show that the HDSP achieves better predictive performance than labeled LDA, partially labeled LDA, and the author-topic model, and better negative-review classification performance than the supervised topic model and SVM.
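The core construction the abstract describes — Dirichlet process weights built from normalized gamma variables, with a scaling function tying membership probabilities to document metadata — can be illustrated with a minimal sketch. This is not the paper's implementation: the log-linear scaling function, the parameter names, and the finite truncation to K topics are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def hdsp_document_proportions(beta, w, x):
    """Sketch of HDSP-style document-level topic proportions.

    beta : global topic weights (top-level DP, truncated to K topics)
    w    : per-topic scaling weights, shape (K, M) -- hypothetical parameterization
    x    : metadata/label vector for one document, shape (M,)
    """
    # Scaling function: a log-linear function of the metadata.
    # (The paper develops two scaling methods; this form is an assumption.)
    s = np.exp(w @ x)                      # shape (K,), strictly positive

    # Normalized-gamma construction: draw gamma variables whose shape
    # parameter is the global weight scaled by s, then normalize so the
    # result is a probability vector over topics.
    g = rng.gamma(shape=beta * s, scale=1.0)
    return g / g.sum()

K, M = 5, 3                                # topics, metadata features
beta = rng.dirichlet(np.ones(K))           # global topic proportions
w = rng.normal(size=(K, M))                # scaling weights (illustrative)
x = rng.normal(size=M)                     # one document's metadata
pi = hdsp_document_proportions(beta, w, x) # document topic proportions
```

Because the scaling enters through the gamma shape parameter, topics whose scaling function is large for a document's metadata receive higher expected membership, which is how the model couples metadata to mixture components.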


Full work available at URL: https://arxiv.org/abs/1404.1282






Cited In (6)






