Regularized Estimation and Feature Selection in Mixtures of Gaussian-Gated Experts Models


DOI: 10.1007/978-981-15-1960-4_3
zbMATH Open: 1445.62325
arXiv: 1909.05494
OpenAlex: W3007594395
MaRDI QID: Q3305484
FDO: Q3305484


Authors: Faicel Chamroukhi, Florian Lecocq, Hien D. Nguyen


Publication date: 7 August 2020

Published in: Communications in Computer and Information Science

Abstract: Mixtures-of-Experts (MoE) models and their maximum likelihood estimation (MLE) via the EM algorithm have been thoroughly studied in the statistics and machine learning literature. They are the subject of growing investigation in the context of modeling with high-dimensional predictors via regularized MLE. We examine MoE models with a Gaussian gating network, for clustering and regression, and propose an ℓ1-regularized MLE to encourage sparse models and deal with the high-dimensional setting. We develop an EM-Lasso algorithm for parameter estimation and use a BIC-like criterion for model selection, including the choice of the sparsity tuning hyperparameters. Experiments on simulated data show the good performance of the proposed regularized MLE compared to the standard MLE fitted with the EM algorithm.
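The abstract describes the penalized objective and the EM-Lasso scheme only at a high level. As a rough illustration, and assuming the usual Gaussian-gated MoE notation (a Gaussian mixture gating network on the inputs x and Gaussian regression experts on y; the paper's exact penalty may differ, e.g. it may also penalize the gating means), the ℓ1-penalized log-likelihood takes the form

    PL(\theta) = \sum_{i=1}^{n} \log \sum_{k=1}^{K} \pi_k(x_i)\, \phi\!\left(y_i ;\, \beta_{k0} + x_i^\top \beta_k ,\, \sigma_k^2\right) \;-\; \sum_{k=1}^{K} \lambda_k \lVert \beta_k \rVert_1 ,
    \qquad \pi_k(x) = \frac{\alpha_k\, \phi_d(x ;\, \mu_k, \Sigma_k)}{\sum_{l=1}^{K} \alpha_l\, \phi_d(x ;\, \mu_l, \Sigma_l)} .

The sketch below shows what one iteration of an EM-Lasso-style algorithm for this model could look like. It is a minimal sketch under the assumptions above, not the paper's implementation: the weighted Lasso via scikit-learn stands in for the paper's coordinate-ascent updates, the gating-mean penalty is omitted, and all names (em_lasso_step, tau, lam, ...) are illustrative.

    import numpy as np
    from scipy.stats import multivariate_normal, norm
    from sklearn.linear_model import Lasso

    def em_lasso_step(X, y, alpha, mu, Sigma, beta0, beta, sigma2, lam):
        """One EM-Lasso-style iteration for a K-expert Gaussian-gated MoE.

        X: (n, d) inputs; y: (n,) responses; alpha: (K,) mixing weights;
        mu: (K, d) gating means; Sigma: (K, d, d) gating covariances;
        beta0: (K,) intercepts; beta: (K, d) expert coefficients;
        sigma2: (K,) expert noise variances; lam: l1 penalty strength
        (on scikit-learn's scale, i.e. its 1/(2n)-normalized Lasso objective).
        """
        n, d = X.shape
        K = len(alpha)

        # E-step: responsibilities tau[i, k] proportional to
        # alpha_k * N(x_i; mu_k, Sigma_k) * N(y_i; beta0_k + x_i' beta_k, sigma2_k).
        tau = np.empty((n, K))
        for k in range(K):
            gate = alpha[k] * multivariate_normal.pdf(X, mean=mu[k], cov=Sigma[k])
            expert = norm.pdf(y, loc=beta0[k] + X @ beta[k], scale=np.sqrt(sigma2[k]))
            tau[:, k] = gate * expert
        tau /= tau.sum(axis=1, keepdims=True)

        # M-step: tau-weighted updates of the gating and expert parameters.
        for k in range(K):
            w = tau[:, k]
            sw = w.sum()
            alpha[k] = sw / n
            mu[k] = w @ X / sw                                   # gating mean (unpenalized here)
            Xc = X - mu[k]
            Sigma[k] = (w[:, None] * Xc).T @ Xc / sw + 1e-6 * np.eye(d)  # small ridge for stability

            # Weighted Lasso: minimizes the tau-weighted squared error
            # plus lam * ||beta_k||_1, giving sparse expert coefficients.
            fit = Lasso(alpha=lam, fit_intercept=True, max_iter=10_000)
            fit.fit(X, y, sample_weight=w)
            beta0[k], beta[k] = fit.intercept_, fit.coef_

            resid = y - beta0[k] - X @ beta[k]
            sigma2[k] = w @ resid**2 / sw                        # weighted residual variance

        return alpha, mu, Sigma, beta0, beta, sigma2, tau

Iterating such a step to convergence of the penalized log-likelihood, over a grid of lam values and numbers of experts K, and scoring each fit with a BIC-like criterion, would reproduce the overall workflow the abstract describes.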


Full work available at URL: https://arxiv.org/abs/1909.05494




Cited in: 4 publications




