A universal approximation theorem for mixture-of-experts models
DOI: 10.1162/NECO_A_00892 · zbMATH Open: 1474.68266 · arXiv: 1602.03683 · OpenAlex: W2270588982 · Wikidata: Q39391760 · Scholia: Q39391760 · MaRDI QID: Q5380595 · FDO: Q5380595
Authors: Luke R. Lloyd-Jones, Hien D. Nguyen, Geoffrey J. McLachlan
Publication date: 5 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1602.03683
Recommendations
- Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models
- Error bounds for functional approximation and estimation using mixtures of experts
- A flexible probabilistic framework for large-margin mixture of experts
- Mixture of experts architectures for neural networks as a special case of conditional expectation formula.
- Hierarchical mixtures-of-experts for exponential family regression models: Approximation and maximum likelihood estimation
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
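For orientation, the mixture-of-experts models studied in approximation results of this kind specify a conditional density as a gated combination of expert densities; the display below is a standard formulation of such a model and is not quoted from this record:
\[
f(y \mid x) \;=\; \sum_{k=1}^{K} g_k(x;\gamma)\, f_k(y \mid x;\theta_k),
\qquad
g_k(x;\gamma) \;=\; \frac{\exp\!\left(a_k + b_k^\top x\right)}{\sum_{j=1}^{K} \exp\!\left(a_j + b_j^\top x\right)},
\]
where the \(g_k\) are softmax gating functions and the \(f_k\) are expert conditional densities; a universal approximation theorem in this setting concerns how well such mixtures can approximate a target (conditional) mean function or density as the number of experts \(K\) grows.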
Cites Work
- Fitting finite mixtures of generalized linear regressions in R
- Estimating the dimension of a model
- Finite mixture models
- Hierarchical mixtures-of-experts for exponential family regression models: Approximation and maximum likelihood estimation
- Laplace mixture of linear experts
- New estimation and feature selection methods in mixture-of-experts models
- Approximation by superpositions of a sigmoidal function
- Approximation of conditional densities by smooth mixtures of regressions
- On the asymptotic normality of hierarchical mixtures-of-experts for generalized linear models
- Error bounds for functional approximation and estimation using mixtures of experts
- On convergence rates of mixtures of polynomial experts
Cited In (8)
- Uniform consistency in nonparametric mixture models
- A class of mixture of experts models for general insurance: theoretical developments
- Laplace mixture of linear experts
- Title not available
- Conditional sum-product networks: modular probabilistic circuits via gate functions
- A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
- Error bounds for functional approximation and estimation using mixtures of experts
- Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models