High-dimensional graphs and variable selection with the Lasso
From MaRDI portal
Publication:2500458
Abstract: The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs. Neighborhood selection estimates the conditional independence restrictions separately for each node in the graph and is hence equivalent to variable selection for Gaussian linear models. We show that the proposed neighborhood selection scheme is consistent for sparse high-dimensional graphs. Consistency hinges on the choice of the penalty parameter: the oracle value for optimal prediction does not lead to a consistent neighborhood estimate. If, instead, the probability of falsely joining distinct connectivity components of the graph is controlled, consistent estimation for sparse graphs is achieved (with exponential rates), even when the number of variables grows as the number of observations raised to an arbitrary power.
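The neighborhood selection scheme described in the abstract can be sketched in a few lines: for each node, Lasso-regress that variable on all the others, take the nonzero coefficients as its estimated neighborhood, and combine the per-node estimates into an edge set. This is a minimal illustration, not the paper's calibrated procedure: the penalty value `lam` is an arbitrary assumption (the paper derives a theoretically motivated choice), the Lasso is solved by plain coordinate descent, and edges are combined with the AND rule (an edge is kept only if both endpoints select each other).

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso for objective (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]          # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - n * lam, 0.0) / col_sq[j]
    return beta

def neighborhood_selection(X, lam):
    """Estimate the edge set of a Gaussian graphical model, one node at a time."""
    n, p = X.shape
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)             # standardize columns
    adj = np.zeros((p, p), dtype=bool)
    for a in range(p):
        others = [j for j in range(p) if j != a]
        beta = lasso_cd(Xs[:, others], Xs[:, a], lam)
        for b, coef in zip(others, beta):
            adj[a, b] = coef != 0.0                       # neighborhood of node a
    return adj & adj.T                                    # AND rule for edges

# Toy chain graph X1 - X2 - X3: X1 and X3 are conditionally
# independent given X2, so the (1,3) precision entry is zero.
rng = np.random.default_rng(0)
n = 2000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)
x3 = 0.8 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
print(neighborhood_selection(X, lam=0.1).astype(int))
```

On this chain example the estimated adjacency recovers edges 1-2 and 2-3 while excluding the spurious 1-3 edge, mirroring the abstract's point that each node's neighborhood estimation is just Lasso variable selection in a Gaussian linear model.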
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
Cites work
- scientific article; zbMATH DE number 469396
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 845714
- A Statistical View of Some Chemometrics Regression Tools
- Asymptotics for Lasso-type estimators
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Functional aggregation for nonparametric regression
- Gaussian Markov distributions over finite graphs
- Introduction to Graphical Modelling
- Least angle regression. (With discussion)
- Linear Model Selection by Cross-Validation
- Model selection for Gaussian concentration graphs
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Weak convergence and empirical processes. With applications to statistics
Cited in
(only showing first 100 items)
- On constrained and regularized high-dimensional regression
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Covariance estimation: the GLM and regularization perspectives
- Learning high-dimensional directed acyclic graphs with latent and selection variables
- On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
- UPS delivers optimal phase diagram in high-dimensional variable selection
- Model selection and estimation in the matrix normal graphical model
- A Bayesian approach to sparse dynamic network identification
- Nonnegative elastic net and application in index tracking
- Influence measures and stability for graphical models
- A note on the one-step estimator for ultrahigh dimensionality
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- Network inference and biological dynamics
- Group selection in high-dimensional partially linear additive models
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- On the asymptotic properties of the group lasso estimator for linear models
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- Adaptive cluster expansion for the inverse Ising problem: convergence, algorithm and tests
- Berry-Esseen bounds for estimating undirected graphs
- Confidence intervals for high-dimensional partially linear single-index models
- Nonparametric eigenvalue-regularized precision or covariance matrix estimator
- Simultaneous variable selection and estimation in semiparametric modeling of longitudinal/clustered data
- Strong oracle optimality of folded concave penalized estimation
- Joint high-dimensional Bayesian variable and covariance selection with an application to eQTL analysis
- Near-ideal model selection by \(\ell _{1}\) minimization
- Bayesian discriminant analysis using a high dimensional predictor
- Global solutions to folded concave penalized nonconvex learning
- Adaptive lasso for generalized linear models with a diverging number of parameters
- Adaptive Dantzig density estimation
- Large covariance estimation through elliptical factor models
- Banded regularization of autocovariance matrices in application to parameter estimation and forecasting of time series
- Minimum distance Lasso for robust high-dimensional regression
- Nonstationary Modeling With Sparsity for Spatial Data via the Basis Graphical Lasso
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Covariate assisted screening and estimation
- Optimality of Graphlet Screening in High Dimensional Variable Selection
- Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage
- Some theoretical results on the grouped variables Lasso
- High dimensional Gaussian copula graphical model with FDR control
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Robust subspace clustering
- Adaptive Lasso in high-dimensional settings
- Strong consistency of Lasso estimators
- On the use of the Lasso for instrumental variables estimation with some invalid instruments
- An inexact interior point method for \(L_{1}\)-regularized sparse covariance selection
- Testing a single regression coefficient in high dimensional linear models
- Two tales of variable selection for high dimensional regression: Screening and model building
- Adaptive estimation of the baseline hazard function in the Cox model by model selection, with high-dimensional covariates
- Forest Garrote
- Joint estimation of precision matrices in heterogeneous populations
- On estimation of the diagonal elements of a sparse precision matrix
- Estimation of covariance matrix via the sparse Cholesky factor with lasso
- Quantile graphical models: a Bayesian approach
- Modeling dependent gene expression
- Fused multiple graphical lasso
- Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models
- A rank-corrected procedure for matrix completion with fixed basis coefficients
- Support union recovery in high-dimensional multivariate regression
- Estimation of Positive Semidefinite Correlation Matrices by Using Convex Quadratic Semidefinite Programming
- Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso
- Group-wise semiparametric modeling: a SCSE approach
- A majorization-minimization approach to variable selection using spike and slab priors
- On confidence intervals for semiparametric expectile regression
- Semiparametric efficiency bounds for high-dimensional models
- A focused information criterion for graphical models in fMRI connectivity with high-dimensional data
- SPADES and mixture models
- Posterior convergence rates for estimating large precision matrices using graphical models
- Bayesian hyper-Lassos with non-convex penalization
- Oracle inequalities for the Lasso in the high-dimensional Aalen multiplicative intensity model
- A focused information criterion for graphical models
- Sparsistency and agnostic inference in sparse PCA
- Variable selection for panel count data via non-concave penalized estimating function
- A sparse Ising model with covariates
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- Self-concordant analysis for logistic regression
- Estimation and variable selection in partial linear single index models with error-prone linear covariates
- A new perspective on least squares under convex constraint
- Inferring gene-gene interactions and functional modules using sparse canonical correlation analysis
- Preconditioning for feature selection and regression in high-dimensional problems
- Focused vector information criterion model selection and model averaging regression with missing response
- Variable selection, monotone likelihood ratio and group sparsity
- Bayesian high-dimensional screening via MCMC
- Estimation of high-dimensional partially-observed discrete Markov random fields
- Probabilistic graphical models and Markov networks
- Dynamic networks with multi-scale temporal structure
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- Fast and adaptive sparse precision matrix estimation in high dimensions
- General nonexact oracle inequalities for classes with a subexponential envelope
- Mirror averaging with sparsity priors
- Semiparametric regression models with additive nonparametric components and high dimensional parametric components
- Generalization of constraints for high dimensional regression problems
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Estimating spatial covariance using penalised likelihood with weighted \(L_1\) penalty
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- Estimation and variable selection with exponential weights
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
- Sparse covariance thresholding for high-dimensional variable selection
- Causal statistical inference in high dimensions
- Profiled adaptive elastic-net procedure for partially linear models with high-dimensional covariates