High-dimensional graphs and variable selection with the Lasso
From MaRDI portal
Publication:2500458
Abstract: The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs. Neighborhood selection estimates the conditional independence restrictions separately for each node in the graph and is hence equivalent to variable selection for Gaussian linear models. We show that the proposed neighborhood selection scheme is consistent for sparse high-dimensional graphs. Consistency hinges on the choice of the penalty parameter. The oracle value for optimal prediction does not lead to a consistent neighborhood estimate. Controlling instead the probability of falsely joining some distinct connectivity components of the graph, consistent estimation for sparse graphs is achieved (with exponential rates), even when the number of variables grows as the number of observations raised to an arbitrary power.
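The neighborhood-selection scheme summarized in the abstract can be sketched in a few lines: regress each node on all remaining nodes with the Lasso and read edges off the nonzero coefficients, combining the per-node estimates with an AND (or OR) rule. The coordinate-descent solver, the fixed penalty level `alpha=0.12`, and the three-node toy chain graph below are illustrative choices, not the penalty scaling the paper's consistency theory prescribes.

```python
import numpy as np

def lasso_cd(X, y, alpha, n_iter=200):
    """Plain coordinate-descent Lasso minimizing
    (1/2n)||y - Xw||^2 + alpha * ||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n          # per-column curvature
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]     # partial residual w/o coord j
            rho = X[:, j] @ r / n
            w[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / col_sq[j]
    return w

def neighborhood_selection(X, alpha, rule="and"):
    """Nodewise Lasso: regress each variable on all others; an edge
    j-k is kept when both (AND) or either (OR) neighborhood contains it."""
    n, p = X.shape
    nonzero = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        w = lasso_cd(X[:, others], X[:, j], alpha)
        nonzero[j, others] = np.abs(w) > 1e-8
    return nonzero & nonzero.T if rule == "and" else nonzero | nonzero.T

# Toy chain graph 0 - 1 - 2: nodes 0 and 2 are conditionally
# independent given node 1, so no 0-2 edge should be estimated.
rng = np.random.default_rng(0)
n = 2000
hub = rng.normal(size=n)                       # node 1
x0 = 0.5 * hub + 0.9 * rng.normal(size=n)      # node 0
x2 = 0.5 * hub + 0.9 * rng.normal(size=n)      # node 2
X = np.column_stack([x0, hub, x2])
X -= X.mean(axis=0)                            # center: Lasso has no intercept
edges = neighborhood_selection(X, alpha=0.12)
```

The AND rule used here is the more conservative of the two symmetrization rules discussed in the paper; the OR rule keeps an edge whenever either nodewise regression selects it.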
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
Cites work
- scientific article; zbMATH DE number 469396
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 845714
- A Statistical View of Some Chemometrics Regression Tools
- Asymptotics for Lasso-type estimators
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Functional aggregation for nonparametric regression
- Gaussian Markov distributions over finite graphs
- Introduction to Graphical Modelling
- Least angle regression. (With discussion)
- Linear Model Selection by Cross-Validation
- Model selection for Gaussian concentration graphs
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Weak convergence and empirical processes. With applications to statistics
Cited in
(only the first 100 items are shown)
- A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models
- A Unified Framework for Change Point Detection in High-Dimensional Linear Models
- A review of Gaussian Markov models for conditional independence
- Change-Point Detection for Graphical Models in the Presence of Missing Values
- Computational implications of reducing data to sufficient statistics
- Compound Poisson models for weighted networks with applications in finance
- Learning high-dimensional Gaussian linear structural equation models with heterogeneous error variances
- Promote sign consistency in the joint estimation of precision matrices
- Learning quadratic variance function (QVF) DAG models via overdispersion scoring (ODS)
- Globally Adaptive Longitudinal Quantile Regression With High Dimensional Compositional Covariates
- Doubly debiased Lasso: high-dimensional inference under hidden confounding
- Robust regression via multivariate regression depth
- Generalized score matching for non-negative data
- Distributed testing and estimation under sparse high dimensional models
- Variable selection for semiparametric regression models with iterated penalisation
- Bayesian graphical regression
- Estimating Time-Varying Graphical Models
- Estimation of the inverse scatter matrix of an elliptically symmetric distribution
- Bayesian regularization for graphical models with unequal shrinkage
- High-dimensional Cox models: the choice of penalty as part of the model building process
- Network assisted analysis to reveal the genetic basis of autism
- Maximum-type tests for high-dimensional regression coefficients using Wilcoxon scores
- High-dimensional Poisson structural equation model learning via \(\ell_1\)-regularized regression
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Objective Bayesian search of Gaussian directed acyclic graphical models for ordered variables with non-local priors
- Confidence intervals for sparse penalized regression with random designs
- Spectral analysis of high-dimensional time series
- Broken adaptive ridge regression and its asymptotic properties
- An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation
- Variable selection in functional regression models: a review
- Estimating a common covariance matrix for network meta-analysis of gene expression datasets in diffuse large B-cell lymphoma
- Nonparametric Bayesian learning of heterogeneous dynamic transcription factor networks
- Large-scale multivariate sparse regression with applications to UK Biobank
- A tuning-free robust and efficient approach to high-dimensional regression
- Model selection consistency of Lasso for empirical data
- The cluster graphical Lasso for improved estimation of Gaussian graphical models
- The degrees of freedom of partly smooth regularizers
- High-dimensional linear model selection motivated by multiple testing
- Covariance-based sample selection for heterogeneous data: applications to gene expression and autism risk gene detection
- A general family of trimmed estimators for robust high-dimensional data analysis
- Confidence intervals for high-dimensional Cox models
- Region selection in Markov random fields: Gaussian case
- Sparse Laplacian shrinkage with the graphical Lasso estimator for regression problems
- Bayesian structure learning in graphical models
- Dynamic and robust Bayesian graphical models
- Network-based discriminant analysis for multiclassification
- Sparse and low-rank matrix regularization for learning time-varying Markov networks
- Globally adaptive quantile regression with ultra-high dimensional data
- Simultaneous feature selection and clustering based on square root optimization
- High-dimensional sparse portfolio selection with nonnegative constraint
- Exact estimation of multiple directed acyclic graphs
- REMI: regression with marginal information and its application in genome-wide association studies
- Efficient Bayesian regularization for graphical model selection
- An efficient ADMM algorithm for high dimensional precision matrix estimation via penalized quadratic loss
- Bayesian hypothesis testing for Gaussian graphical models: conditional independence and order constraints
- Recovery of partly sparse and dense signals
- Covariance structure approximation via gLasso in high-dimensional supervised classification
- Inference for Nonparanormal Partial Correlation via Regularized Rank-Based Nodewise Regression
- Estimation of positive definite \(M\)-matrices and structure learning for attractive Gaussian Markov random fields
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- Treelets -- an adaptive multi-scale basis for sparse unordered data
- Operator-valued kernel-based vector autoregressive models for network inference
- Generalized Kalman smoothing: modeling and algorithms
- High dimensional posterior convergence rates for decomposable graphical models
- The Dantzig selector: recovery of signal via \(\ell_1 - \alpha\ell_2\) minimization
- Discussion: Latent variable graphical model selection via convex optimization
- Sparse regression: scalable algorithms and empirical performance
- Copula Gaussian Graphical Models for Functional Data
- Gaussian and bootstrap approximations for high-dimensional U-statistics and their applications
- Regularization and the small-ball method. I: Sparse recovery
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Fitting sparse linear models under the sufficient and necessary condition for model identification
- Central limit theorem for linear spectral statistics of general separable sample covariance matrices with applications
- Sequential Lasso cum EBIC for feature selection with ultra-high dimensional feature space
- Interpreting latent variables in factor models via convex optimization
- Nonparametric and high-dimensional functional graphical models
- Estimating heterogeneous graphical models for discrete data with an application to roll call voting
- Learning relational dependency networks in hybrid domains
- Necessary and sufficient conditions for variable selection consistency of the Lasso in high dimensions
- Poisson dependency networks: gradient boosted models for multivariate count data
- The smooth-Lasso and other \(\ell_1+\ell_2\)-penalized methods
- On path restoration for censored outcomes
- A greedy feature selection algorithm for big data of high dimensionality
- Leave-one-out cross-validation is risk consistent for Lasso
- The performance of covariance selection methods that consider decomposable models only
- A loss-based prior for Gaussian graphical models
- Laplace error penalty-based variable selection in high dimension
- Estimating finite mixtures of ordinal graphical models
- Block-diagonal precision matrix regularization for ultra-high dimensional data
- Data science, big data and statistics
- Learning Gaussian graphical models with fractional marginal pseudo-likelihood
- Robust methods for inferring sparse network structures
- A sequential scaled pairwise selection approach to edge detection in nonparanormal graphical models
- Robust machine learning by median-of-means: theory and practice
- On nonparametric feature filters in electromagnetic imaging
- Learning semidefinite regularizers
- Model selection and local geometry
- Sorted concave penalized regression
- Individual-specific, sparse inverse covariance estimation in generalized estimating equations