A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
DOI: 10.1214/22-EJS2057
MaRDI QID: Q2084460
Hien Duy Nguyen, Florence Forbes, TrungTin Nguyen, Faicel Chamroukhi
Publication date: 18 October 2022
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/2104.02640
Keywords: model selection; clustering; mixture of experts; oracle inequality; graphical Lasso; penalized maximum likelihood; mixture of regressions; block-diagonal covariance matrix; Gaussian locally-linear mapping models; linear cluster-weighted models
Mathematics Subject Classification:
- Estimation in multivariate analysis (62H12)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Approximations to statistical distributions (nonasymptotic) (62E17)
Cites Work
- High-dimensional regression with Gaussian mixtures and partially-latent response variables
- Local statistical modeling via a cluster-weighted approach with elliptical distributions
- Kullback-Leibler aggregation and misspecified generalized linear models
- Mixture of Gaussian regressions model with logistic weights, a penalized maximum likelihood approach
- Convergence rates of parameter estimation for some weakly identifiable finite mixtures
- \(\ell_{1}\)-penalization for mixture regression models
- Adaptive Dantzig density estimation
- Slope heuristics: overview and implementation
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII – 2003
- Mixtures of regressions with changepoints
- Finite mixture regression: a sparse variable selection by model selection for clustering
- Approximation of conditional densities by smooth mixtures of regressions
- SPADES and mixture models
- On the theory of elliptically contoured distributions
- Estimating the dimension of a model
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Hierarchical mixtures-of-experts for exponential family regression models: Approximation and maximum likelihood estimation
- Inverse regression approach to robust nonlinear high-to-low dimensional mapping
- Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood
- Rates of convergence for the Gaussian mixture sieve
- Weak convergence and empirical processes. With applications to statistics
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- Convergence of latent mixing measures in finite and infinite mixture models
- Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models
- Minimal penalties for Gaussian model selection
- Optimal exponential bounds for aggregation of estimators for the Kullback-Leibler loss
- Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model
- Model-based regression clustering for high-dimensional data: application to functional data
- An \(\ell_{1}\)-oracle inequality for the Lasso in multivariate finite mixture of multivariate Gaussian regression models
- On Convergence Rates of Mixtures of Polynomial Experts
- A Sparse PLS for Variable Selection when Integrating Omics Data
- Posterior consistency in conditional density estimation by covariate dependent mixtures
- Mixture Densities, Maximum Likelihood and the EM Algorithm
- Risk bounds for mixture density estimation
- Sliced Inverse Regression for Dimension Reduction
- Model Selection and Multimodel Inference
- Block-Diagonal Covariance Selection for High-Dimensional Gaussian Graphical Models
- Multivariate extremes, aggregation and dependence in elliptical distributions
- On-line EM Algorithm for the Normalized Gaussian Network
- Multivariate t-Distributions and Their Applications
- A non asymptotic penalized criterion for Gaussian mixture model selection
- Data-driven penalty calibration: A case study for Gaussian mixture model selection
- Approximation by finite mixtures of continuous density functions that vanish at infinity
- Minimal penalties and the slope heuristics: a survey
- Adaptive Bayesian estimation of conditional densities
- A Universal Approximation Theorem for Mixture-of-Experts Models
- An \(\ell_{1}\)-oracle inequality for the Lasso in finite mixture Gaussian regression models
- Partition-based conditional density estimation
- Adaptive density estimation for clustering with Gaussian mixtures
- Some Comments on \(C_p\)
- On the Conditional Distribution of the Multivariate t Distribution
- Maximum Likelihood Estimation of Misspecified Models
- On strong identifiability and convergence rates of parameter estimation in finite mixtures
- A new look at the statistical model identification
- Approximation of probability density functions via location-scale finite mixtures in Lebesgue spaces