Finite mixture regression: a sparse variable selection by model selection for clustering
From MaRDI portal
Abstract: We consider a finite mixture of Gaussian regression models for high-dimensional data, where the number of covariates may be much larger than the sample size. We propose to estimate the unknown conditional mixture density by a maximum likelihood estimator, restricted to the relevant variables selected by an \(\ell_{1}\)-penalized maximum likelihood estimator. We obtain an oracle inequality satisfied by this estimator with a Jensen-Kullback-Leibler type loss. Our oracle inequality is deduced from a general model selection theorem for maximum likelihood estimators with a random model collection. From this theorem we derive the penalty shape of the criterion, which depends on the complexity of the random model collection.
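The two-step procedure the abstract describes (an \(\ell_1\)-penalized likelihood step to select relevant variables, followed by an unpenalized maximum likelihood refit restricted to those variables) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the component count `K`, the penalty level `lam`, the coordinate-descent M-step, and the synthetic data are all assumptions made for the demonstration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def soft_threshold(z, t):
    """Soft-thresholding operator used by the l1 coordinate updates."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_lasso(X, y, w, lam, n_iter=100):
    """Coordinate descent for weighted l1-penalised least squares."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                            # residual, since beta starts at 0
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]          # remove coordinate j's contribution
            num = np.sum(w * X[:, j] * r)
            denom = np.sum(w * X[:, j] ** 2)
            beta[j] = soft_threshold(num, lam * n) / denom
            r -= X[:, j] * beta[j]
    return beta

def em_lasso_mixture(X, y, K=2, lam=0.05, n_iter=30):
    """EM for a K-component mixture of Gaussian regressions,
    with an l1-penalised M-step for the regression coefficients."""
    n, p = X.shape
    resp = rng.dirichlet(np.ones(K), size=n)    # random initial responsibilities
    betas = np.zeros((K, p)); sigmas = np.ones(K); pis = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        for k in range(K):                      # penalised M-step
            w = resp[:, k]
            betas[k] = weighted_lasso(X, y, w, lam)
            res = y - X @ betas[k]
            sigmas[k] = np.sqrt(np.sum(w * res ** 2) / np.sum(w))
            pis[k] = w.mean()
        dens = np.column_stack([                # E-step: posterior responsibilities
            pis[k] * norm.pdf(y, X @ betas[k], sigmas[k]) for k in range(K)])
        resp = dens / np.maximum(dens.sum(axis=1, keepdims=True), 1e-300)
    return betas, sigmas, pis, resp

# Synthetic data: only 2 of 20 covariates are relevant, one per component.
n, p = 400, 20
X = rng.standard_normal((n, p))
z = rng.integers(0, 2, size=n)                  # latent component labels
true_b = np.zeros((2, p)); true_b[0, 0] = 3.0; true_b[1, 1] = -3.0
y = np.einsum('ij,ij->i', X, true_b[z]) + 0.3 * rng.standard_normal(n)

betas, sigmas, pis, resp = em_lasso_mixture(X, y)

# Step 2: keep only the selected (nonzero) variables and refit each
# component by unpenalised weighted least squares -- the MLE restricted
# to the relevant variables.
refit = []
for k in range(2):
    S = np.flatnonzero(np.abs(betas[k]) > 1e-8)
    sw = np.sqrt(resp[:, k])
    b, *_ = np.linalg.lstsq(X[:, S] * sw[:, None], y * sw, rcond=None)
    refit.append((S, b))
print("selected supports:", [list(S) for S, _ in refit])
```

In the paper the second step is a full maximum likelihood refit over a random collection of sparse models, with the model chosen by a penalized criterion whose penalty shape the oracle inequality justifies; the sketch above only refits on the single support returned by the penalized step.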
Recommendations
- An \(\ell_{1}\)-oracle inequality for the Lasso in finite mixture Gaussian regression models
- Variable Selection in Finite Mixture of Regression Models
- \(\ell_{1}\)-penalization for mixture regression models
- An \(\ell_1\)-oracle inequality for the Lasso in multivariate finite mixture of multivariate Gaussian regression models
- Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 845714
- A new look at the statistical model identification
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Estimating the dimension of a model
- Gaussian model selection with an unknown variance
- Information-theoretic determination of minimax rates of convergence
- Least squares after model selection in high-dimensional sparse models
- Minimal penalties for Gaussian model selection
- Nonparametric functional data analysis. Theory and practice.
- On the conditions used to prove oracle results for the Lasso
- Pairwise Variable Selection for High-Dimensional Model-Based Clustering
- Penalized model-based clustering with application to variable selection
- Penalized model-based clustering with unconstrained covariance matrices
- Rates of convergence for the Gaussian mixture sieve.
- Regularized \(k\)-means clustering of high-dimensional data and its asymptotic consistency
- Scaled sparse linear regression
- Simultaneous analysis of Lasso and Dantzig selector
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- \(\ell_{1}\)-penalization for mixture regression models
Cited in (28)
- Screening and clustering of sparse regressions with finite non-Gaussian mixtures
- A mixed integer linear model for clustering with variable selection
- Inverse regression approach to robust nonlinear high-to-low dimensional mapping
- Regression‐based heterogeneity analysis to identify overlapping subgroup structure in high‐dimensional data
- \(\ell_{1}\)-penalization for mixture regression models
- A Bayesian sparse finite mixture model for clustering data from a heterogeneous population
- On the choice of high-dimensional regression parameters in Gaussian random tomography
- Block-Wise Variable Selection for Clustering Via Latent States of Mixture Models
- Sparse oracle inequalities for variable selection via regularized quantization
- An \(\ell_{1}\)-oracle inequality for the Lasso in finite mixture Gaussian regression models
- scientific article; zbMATH DE number 2015574
- Simultaneous variable selection and component selection for regression density estimation with mixtures of heteroscedastic experts
- Sufficient dimension reduction for clustered data via finite mixture modelling
- A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
- Clustering electricity consumers using high-dimensional regression mixture models
- Variable Selection for Clustering with Gaussian Mixture Models
- An \(\ell_1\)-oracle inequality for the Lasso in multivariate finite mixture of multivariate Gaussian regression models
- Feature selection in finite mixture of sparse normal linear models in high-dimensional feature space
- A non asymptotic penalized criterion for Gaussian mixture model selection
- Hypothesis testing in finite mixture of regressions: sparsity and model selection uncertainty
- Clusterwise elastic-net regression based on a combined information criterion
- On mixture regression shrinkage and selection via the MR-LASSO
- Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models
- Bayesian shrinkage in mixture-of-experts models: identifying robust determinants of class membership
- SPADES and mixture models
- Model-based regression clustering for high-dimensional data: application to functional data
- Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model
- Regularized parameter estimation in high-dimensional Gaussian mixture models
This page was built for publication: Finite mixture regression: a sparse variable selection by model selection for clustering (MaRDI item Q902208)