A similarity-based Bayesian mixture-of-experts model

From MaRDI portal
Publication:6173564

DOI: 10.1007/S11222-023-10238-Y
zbMATH Open: 1517.62051
arXiv: 2012.02130
MaRDI QID: Q6173564


Authors: Tianfang Zhang, Rasmus Bokrantz, Jimmy Olsson


Publication date: 21 July 2023

Published in: Statistics and Computing

Abstract: We present a new nonparametric mixture-of-experts model for multivariate regression problems, inspired by the probabilistic k-nearest neighbors algorithm. Using a conditionally specified model, predictions for out-of-sample inputs are based on similarities to each observed data point, yielding predictive distributions represented by Gaussian mixtures. Posterior inference is performed on the parameters of the mixture components as well as the distance metric using a mean-field variational Bayes algorithm accompanied by a stochastic gradient-based optimization procedure. The proposed method is especially advantageous in settings where inputs are of relatively high dimension in comparison to the data size, where input-output relationships are complex, and where predictive distributions may be skewed or multimodal. Computational studies on five datasets, of which two are synthetically generated, illustrate clear advantages of our mixture-of-experts method for high-dimensional inputs, outperforming competitor models both in terms of validation metrics and visual inspection.
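The predictive mechanism described in the abstract — a Gaussian-mixture predictive distribution whose component weights come from similarities to the observed data points under a learned distance metric — can be sketched roughly as follows. This is a minimal illustration, not the authors' exact model: the diagonal metric parameterization, the softmax weighting, and all function names are assumptions made for the example.

```python
import numpy as np

def predictive_mixture(x_new, X_train, y_train, metric_weights, sigma2):
    """Similarity-weighted Gaussian mixture over training outputs (sketch).

    metric_weights: assumed diagonal of a learned distance metric;
    sigma2: assumed common component variance. Both would be inferred
    by variational Bayes in the actual model.
    """
    # Squared distances from the new input to each training input
    # under the (hypothetical) learned diagonal metric
    d2 = np.sum(metric_weights * (X_train - x_new) ** 2, axis=1)
    # Similarity-based mixture weights: softmax over negative distances,
    # in the spirit of the probabilistic k-nearest-neighbors algorithm
    logits = -d2
    w = np.exp(logits - logits.max())
    w /= w.sum()
    # Each training point acts as an "expert" contributing one Gaussian
    # component, here centered directly at its observed output
    return w, y_train, sigma2  # weights, component means, common variance

def predictive_mean(x_new, X_train, y_train, metric_weights, sigma2):
    # Mean of the mixture: similarity-weighted average of training outputs
    w, means, _ = predictive_mixture(x_new, X_train, y_train,
                                     metric_weights, sigma2)
    return float(np.dot(w, means))
```

In the paper, the metric and the component parameters are random quantities whose posteriors are approximated by mean-field variational Bayes with stochastic-gradient optimization; the sketch above only shows how a fitted metric would induce a mixture prediction.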


Full work available at URL: https://arxiv.org/abs/2012.02130





Cites Work


Cited In (4)





This page was built for publication: A similarity-based Bayesian mixture-of-experts model
