Regularized Estimation and Feature Selection in Mixtures of Gaussian-Gated Experts Models
DOI: 10.1007/978-981-15-1960-4_3 | zbMATH Open: 1445.62325 | arXiv: 1909.05494 | OpenAlex: W3007594395 | MaRDI QID: Q3305484
Authors: Faicel Chamroukhi, Florian Lecocq, Hien D. Nguyen
Publication date: 7 August 2020
Published in: Communications in Computer and Information Science
Full work available at URL: https://arxiv.org/abs/1909.05494
Recommendations
- Regularized estimation and feature selection in mixtures of experts
- New estimation and feature selection methods in mixture-of-experts models
- Regularization and selection in Gaussian mixture of autoregressive models
- RBF nets, mixture experts, and Bayesian Ying-Yang learning
- AI 2005: Advances in Artificial Intelligence
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Statistical aspects of big data and data science (62R07)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Pathwise coordinate optimization
- Coordinate descent algorithms for lasso penalized regression
- Penalized model-based clustering with application to variable selection
- Title not available
- Finite mixture models
- Title not available
- The EM Algorithm and Extensions, 2E
- Title not available
- Time series modeling by a regression approach based on a latent process
- Laplace mixture of linear experts
- New estimation and feature selection methods in mixture-of-experts models
- Rejoinder to the comments on: \(\ell_1\)-penalization for mixture regression models
- Robust mixture of experts modeling using the \(t\) distribution
- Regularized estimation and feature selection in mixtures of experts
Cited In (4)