High-dimensional graphs and variable selection with the Lasso
From MaRDI portal
Publication:2500458
Abstract: The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs. Neighborhood selection estimates the conditional independence restrictions separately for each node in the graph and is hence equivalent to variable selection for Gaussian linear models. We show that the proposed neighborhood selection scheme is consistent for sparse high-dimensional graphs. Consistency hinges on the choice of the penalty parameter. The oracle value for optimal prediction does not lead to a consistent neighborhood estimate. Controlling instead the probability of falsely joining some distinct connectivity components of the graph, consistent estimation for sparse graphs is achieved (with exponential rates), even when the number of variables grows as the number of observations raised to an arbitrary power.
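The neighborhood-selection scheme summarized in the abstract can be sketched in a few lines: each node is Lasso-regressed on all remaining nodes, and an edge is kept when a neighborhood estimate selects it. This is a minimal illustration, not the paper's procedure in detail; the coordinate-descent solver, the toy three-node chain, and the penalty value are all illustrative choices.

```python
# Minimal sketch of neighborhood selection with the Lasso: each node is
# regressed on all remaining nodes, and an edge {j, k} is kept when either
# neighborhood estimate selects the other node (the "or" rule; "and"
# intersects instead). Data, penalty, and solver settings are illustrative.
import numpy as np

def soft_threshold(z, a):
    return np.sign(z) * np.maximum(np.abs(z) - a, 0.0)

def lasso_cd(X, y, alpha, n_sweeps=200):
    """Coordinate-descent Lasso for (1/(2n))||y - Xb||^2 + alpha*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_sweeps):
        for j in range(p):
            # partial residual with coordinate j's contribution removed
            r_j = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r_j / n, alpha) / col_sq[j]
    return b

def neighborhood_selection(X, alpha, rule="or"):
    """Estimate the edge set of a sparse Gaussian graph, one node at a time."""
    n, p = X.shape
    sel = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        sel[j, others] = lasso_cd(X[:, others], X[:, j], alpha) != 0.0
    return sel | sel.T if rule == "or" else sel & sel.T

# Three-node chain 0 - 1 - 2: nodes 0 and 2 are conditionally independent
# given node 1, so the true graph has no edge {0, 2}.
rng = np.random.default_rng(0)
n = 5000
x1 = rng.standard_normal(n)
x0 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
edges = neighborhood_selection(np.column_stack([x0, x1, x2]), alpha=0.25)
```

As the abstract stresses, consistency hinges on the penalty parameter: too small a penalty falsely joins distinct connectivity components, while too large a penalty drops true edges.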
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
Cites work
- scientific article; zbMATH DE number 469396
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 845714
- A Statistical View of Some Chemometrics Regression Tools
- Asymptotics for Lasso-type estimators
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Functional aggregation for nonparametric regression
- Gaussian Markov distributions over finite graphs
- Introduction to Graphical Modelling
- Least angle regression (with discussion)
- Linear Model Selection by Cross-Validation
- Model selection for Gaussian concentration graphs
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Weak convergence and empirical processes. With applications to statistics
Cited in
- Time varying undirected graphs
- High-Dimensional Gaussian Graphical Regression Models with Covariates
- Regularization and variable selection in Heckman selection model
- Structured Lasso for regression with matrix covariates
- Consistent group selection in high-dimensional linear regression
- Covariance and precision matrix estimation for high-dimensional time series
- Penalized estimation in high-dimensional hidden Markov models with state-specific graphical models
- DASSO: Connections Between the Dantzig Selector and Lasso
- Variable selection in high-dimensional quantile varying coefficient models
- High-dimensional Ising model selection with Bayesian information criteria
- High-dimensional covariance matrix estimation with missing observations
- Modeling item-item similarities for personalized recommendations on Yahoo! front page
- Flexible covariance estimation in graphical Gaussian models
- Estimation of high-dimensional graphical models using regularized score matching
- Estimation of high-dimensional low-rank matrices
- Variable selection in nonparametric additive models
- Statistics for big data: a perspective
- Preconditioning the Lasso for sign consistency
- Oracle inequalities for the lasso in the Cox model
- Flexible and Interpretable Models for Survival Data
- Worst possible sub-directions in high-dimensional models
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Structural pursuit over multiple undirected graphs
- Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models
- Stability
- A general theory of concave regularization for high-dimensional sparse estimation problems
- High-dimensional regression with unknown variance
- Gemini: graph estimation with matrix variate normal instances
- Review of statistical network analysis: models, algorithms, and software
- Variable selection for high dimensional multivariate outcomes
- Learning Sparse Causal Gaussian Networks With Experimental Intervention: Regularization and Coordinate Descent
- Relaxed Lasso
- Latent variable graphical model selection via convex optimization
- Conditional score matching for high-dimensional partial graphical models
- A selective review of group selection in high-dimensional models
- Higher criticism for large-scale inference, especially for rare and weak effects
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Discussion: Latent variable graphical model selection via convex optimization
- Consistent high-dimensional Bayesian variable selection via penalized credible regions
- Estimating time-varying networks
- Likelihood-based selection and sharp parameter estimation
- Hypothesis Testing of Matrix Graph Model with Application to Brain Connectivity Analysis
- Estimation in high-dimensional linear models with deterministic design matrices
- High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
- An Expectation Conditional Maximization Approach for Gaussian Graphical Models
- Tuning-free heterogeneous inference in massive networks
- Sparse Matrix Graphical Models
- Bootstrap inference for network construction with an application to a breast cancer microarray study
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Robust regression through the Huber's criterion and adaptive lasso penalty
- High-dimensional structure estimation in Ising models: local separation criterion
- A semiparametric graphical modelling approach for large-scale equity selection
- Graphical-model based high dimensional generalized linear models
- Estimating high-dimensional intervention effects from observational data
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal detection of heterogeneous and heteroscedastic mixtures
- Fitting very large sparse Gaussian graphical models
- Consistency of Bayesian linear model selection with a growing number of parameters
- Estimating sparse precision matrix: optimal rates of convergence and adaptive estimation
- High-dimensional inference for personalized treatment decision
- Discussion: Latent variable graphical model selection via convex optimization
- Simultaneous inference for pairwise graphical models with generalized score matching
- Goodness-of-Fit Tests for High Dimensional Linear Models
- Layer-wise learning strategy for nonparametric tensor product smoothing spline regression and graphical models
- Estimation for high-dimensional linear mixed-effects models using \(\ell_1\)-penalization
- Structure estimation for discrete graphical models: generalized covariance matrices and their inverses
- Rejoinder: Latent variable graphical model selection via convex optimization
- Robust Gaussian graphical modeling via \(l_{1}\) penalization
- Stable graphical model estimation with random forests for discrete, continuous, and mixed variables
- Variable selection in model-based clustering and discriminant analysis with a regularization approach
- High-dimensional joint estimation of multiple directed Gaussian graphical models
- On model selection consistency of regularized M-estimators
- Random matrix theory in statistics: a review
- Estimation of Gaussian graphs by model selection
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Selection by partitioning the solution paths
- Bayesian variable selection for high dimensional generalized linear models: convergence rates of the fitted densities
- Transfer Learning under High-dimensional Generalized Linear Models
- Rejoinder: One-step sparse estimates in nonconcave penalized likelihood models
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- A two-step method for estimating high-dimensional Gaussian graphical models
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- Focused vector information criterion model selection and model averaging regression with missing response
- Variable selection, monotone likelihood ratio and group sparsity
- Bayesian high-dimensional screening via MCMC
- Estimation of high-dimensional partially-observed discrete Markov random fields
- Probabilistic graphical models and Markov networks
- Dynamic networks with multi-scale temporal structure
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- Fast and adaptive sparse precision matrix estimation in high dimensions
- General nonexact oracle inequalities for classes with a subexponential envelope
- Mirror averaging with sparsity priors
- Semiparametric regression models with additive nonparametric components and high dimensional parametric components
- Generalization of constraints for high dimensional regression problems
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Estimating spatial covariance using penalised likelihood with weighted \(L_1\) penalty
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- Estimation and variable selection with exponential weights
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
This page was built for publication: High-dimensional graphs and variable selection with the Lasso