Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models

From MaRDI portal
Publication:2129248

DOI: 10.1186/S40488-021-00125-0 · zbMATH Open: 1490.62162 · arXiv: 2012.02385 · OpenAlex: W3188130019 · Wikidata: Q114061064 · Scholia: Q114061064 · MaRDI QID: Q2129248 · FDO: Q2129248

Geoffrey J. McLachlan, Faicel Chamroukhi, TrungTin Nguyen, Hien D. Nguyen

Publication date: 22 April 2022

Published in: Journal of Statistical Distributions and Applications

Abstract: Mixture of experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces when the input and output variables are both compactly supported. We further prove an almost uniform convergence result when the input is univariate. Auxiliary lemmas are proved regarding the richness of the soft-max gating function class and its relationship to the class of Gaussian gating functions.
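For concreteness, the MoE conditional densities considered in such denseness results take the standard form f(y | x) = Σ_{k=1}^{K} g_k(x) φ(y; μ_k(x), σ_k²), where the soft-max gates g_k(x) = exp(a_k + b_k x) / Σ_j exp(a_j + b_j x) are nonnegative and sum to one for every input x. The sketch below is a minimal Python illustration of this model class for a univariate input and output; the parameter names (gate_w, gate_b, means_w, means_b, sigmas) and the linear form of the gates and expert means are illustrative assumptions, not the authors' construction.

```python
import numpy as np
from scipy.stats import norm

def moe_conditional_density(y, x, gate_w, gate_b, means_w, means_b, sigmas):
    """Evaluate f(y | x) for a K-component soft-max-gated Gaussian MoE.

    Gating:  g_k(x) = exp(gate_w[k] * x + gate_b[k]) / sum_j exp(...)
    Experts: y | x, component k ~ N(means_w[k] * x + means_b[k], sigmas[k]^2)
    (Linear gates and means are an illustrative assumption.)
    """
    logits = gate_w * x + gate_b                   # gate scores, shape (K,)
    logits -= logits.max()                         # stabilise the soft-max
    gates = np.exp(logits) / np.exp(logits).sum()  # soft-max gating weights
    mus = means_w * x + means_b                    # input-dependent expert means
    return float(np.sum(gates * norm.pdf(y, loc=mus, scale=sigmas)))

# Example: a 2-expert mixture evaluated at a single (x, y) pair
gate_w = np.array([2.0, -2.0]); gate_b = np.array([0.0, 0.0])
means_w = np.array([1.0, -1.0]); means_b = np.array([0.5, -0.5])
sigmas = np.array([0.3, 0.5])
print(moe_conditional_density(y=0.7, x=0.4, gate_w=gate_w, gate_b=gate_b,
                              means_w=means_w, means_b=means_b, sigmas=sigmas))
```

Because the gates sum to one for every x, f(· | x) integrates to one for each input, so the function above is a valid conditional density; the paper's results concern how well such mixtures can approximate arbitrary conditional densities in Lebesgue norms.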


Full work available at URL: https://arxiv.org/abs/2012.02385




Cited in: 7 publications





