High-dimensional graphs and variable selection with the Lasso
From MaRDI portal
Publication: 2500458
Abstract: The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs. Neighborhood selection estimates the conditional independence restrictions separately for each node in the graph and is hence equivalent to variable selection for Gaussian linear models. We show that the proposed neighborhood selection scheme is consistent for sparse high-dimensional graphs. Consistency hinges on the choice of the penalty parameter. The oracle value for optimal prediction does not lead to a consistent neighborhood estimate. Controlling instead the probability of falsely joining some distinct connectivity components of the graph, consistent estimation for sparse graphs is achieved (with exponential rates), even when the number of variables grows as the number of observations raised to an arbitrary power.
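The neighborhood-selection scheme described in the abstract can be sketched in a few lines: each variable is regressed on all other variables with the Lasso, the nonzero coefficients propose edges, and the per-node neighborhoods are combined with an AND (or OR) rule. The following is a minimal illustrative sketch using scikit-learn; the penalty value `alpha=0.1`, the AND rule as default, and the toy data are assumptions for demonstration, not the paper's theoretically calibrated penalty choice.

```python
# Sketch of Lasso neighborhood selection for Gaussian graphical models:
# regress each node on all others; nonzero Lasso coefficients propose edges.
# alpha=0.1 is an illustrative penalty, not the paper's calibrated choice.
import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.1, rule="and"):
    n, p = X.shape
    nonzero = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = np.delete(np.arange(p), j)
        # Lasso regression of node j on all remaining nodes
        model = Lasso(alpha=alpha).fit(X[:, others], X[:, j])
        nonzero[j, others] = model.coef_ != 0
    if rule == "and":
        # AND rule: keep edge (i, j) only if each node selects the other
        adj = nonzero & nonzero.T
    else:
        # OR rule: keep edge (i, j) if either node selects the other
        adj = nonzero | nonzero.T
    return adj

# Toy data: X0 and X1 are dependent, X2 is independent of both
rng = np.random.default_rng(0)
x0 = rng.normal(size=500)
x1 = x0 + 0.5 * rng.normal(size=500)
x2 = rng.normal(size=500)
X = np.column_stack([x0, x1, x2])
adj = neighborhood_selection(X, alpha=0.1)
```

On this toy example the estimated graph should connect `X0` and `X1` while leaving `X2` isolated; the AND rule makes the adjacency matrix symmetric by construction.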
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
Cites work
- scientific article; zbMATH DE number 469396
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 845714
- A Statistical View of Some Chemometrics Regression Tools
- Asymptotics for Lasso-type estimators
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Functional aggregation for nonparametric regression
- Gaussian Markov distributions over finite graphs
- Introduction to Graphical Modelling
- Least angle regression. (With discussion)
- Linear Model Selection by Cross-Validation
- Model selection for Gaussian concentration graphs
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Weak convergence and empirical processes. With applications to statistics
Cited in
(first 100 items shown)
- Sparse estimation of conditional graphical models with application to gene networks
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- A unified framework for structured graph learning via spectral constraints
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Boosting algorithms: regularization, prediction and model fitting
- Regularization for Cox's proportional hazards model with NP-dimensionality
- Adaptive Lasso estimators for ultrahigh dimensional generalized linear models
- Group symmetry and covariance regularization
- High dimensional sparse covariance estimation via directed acyclic graphs
- Sparse permutation invariant covariance estimation
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Sparse nonparametric graphical models
- Stability Selection
- Model selection for factorial Gaussian graphical models with an application to dynamic regulatory networks
- \(\ell_{1}\)-penalization for mixture regression models
- A constrained \(\ell_1\) minimization approach for estimating multiple sparse Gaussian or nonparanormal graphical models
- High-dimensional simultaneous inference with the bootstrap
- Factor models and variable selection in high-dimensional regression analysis
- Sparsistency and rates of convergence in large covariance matrix estimation
- High-dimensional additive modeling
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Gaussian graphical model estimation with false discovery rate control
- Honest confidence regions and optimality in high-dimensional precision matrix estimation
- Graphical models for zero-inflated single cell gene expression
- Tests for Gaussian graphical models
- Inferring large graphs using \(\ell_1\)-penalized likelihood
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Sparse directed acyclic graphs incorporating the covariates
- Large covariance estimation by thresholding principal orthogonal complements. With discussion and authors' reply
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Partial correlation estimation by joint sparse regression models
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Robust graphical modeling of gene networks using classical and alternative \(t\)-distributions
- Correlated variables in regression: clustering and sparse estimation
- Bayesian structure learning in sparse Gaussian graphical models
- Simultaneous analysis of Lasso and Dantzig selector
- Bayesian model selection approach for coloured graphical Gaussian models
- Oracle inequalities for high dimensional vector autoregressions
- ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
- Covariance-regularized regression and classification for high dimensional problems
- Best subset selection via a modern optimization lens
- Lasso-type recovery of sparse representations for high-dimensional data
- A sparse conditional Gaussian graphical model for analysis of genetical genomics data
- Missing values: sparse inverse covariance estimation and an extension to sparse regression
- A note on the Lasso for Gaussian graphical model selection
- Nearly unbiased variable selection under minimax concave penalty
- Fast Computation of Latent Correlations
- A general algorithm for covariance modeling of discrete data
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Graph selection with GGMselect
- Robust sparse Gaussian graphical modeling
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Statistical significance in high-dimensional linear models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- A projection-based conditional dependence measure with applications to high-dimensional undirected graphical models
- Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error
- High-dimensional semiparametric Gaussian copula graphical models
- On principal graphical models with application to gene network
- Simultaneous multiple response regression and inverse covariance matrix estimation via penalized Gaussian maximum likelihood
- Factor-Adjusted Regularized Model Selection
- Sparse inverse covariance estimation with the graphical lasso
- Change-point detection in high-dimensional covariance structure
- Bayesian graphical models for differential pathways
- Regularized estimation of large covariance matrices
- Performance bounds for parameter estimates of high-dimensional linear models with correlated errors
- Quasi-likelihood and/or robust estimation in high dimensions
- Structured sparsity through convex optimization
- Sparse recovery under matrix uncertainty
- Multiple testing and error control in Gaussian graphical model selection
- Lasso-driven inference in time and space
- Inferring sparse Gaussian graphical models with latent structure
- On the conditions used to prove oracle results for the Lasso
- Regularized rank-based estimation of high-dimensional nonparanormal graphical models
- Variable selection and regression analysis for graph-structured covariates with an application to genomics
- High-dimensional variable selection
- Sparse semiparametric discriminant analysis
- The graphical lasso: new insights and alternatives
- High-dimensional generalized linear models and the lasso
- Estimation of sparse directed acyclic graphs for multivariate counts data
- Penalised inference for lagged dependent regression in the presence of autocorrelated residuals
- Network exploration via the adaptive LASSO and SCAD penalties
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Least angle and \(\ell _{1}\) penalized regression: a review
- Exact test theory in Gaussian graphical models
- Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data
- High-dimensional change-point estimation: combining filtering with convex optimization
- Maximum Likelihood Estimation Over Directed Acyclic Gaussian Graphs
- Inferring multiple graphical structures
- On stepwise pattern recovery of the fused Lasso
- Confidence intervals for high-dimensional inverse covariance estimation
- Graph-guided banding of the covariance matrix
- The sparse Laplacian shrinkage estimator for high-dimensional regression
- SCAD-penalized regression in high-dimensional partially linear models
- The log-linear group-lasso estimator and its asymptotic properties
- On constrained and regularized high-dimensional regression
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Covariance estimation: the GLM and regularization perspectives
- Learning high-dimensional directed acyclic graphs with latent and selection variables
- On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
- UPS delivers optimal phase diagram in high-dimensional variable selection