High-dimensional graphs and variable selection with the Lasso
From MaRDI portal
Publication:2500458
Abstract: The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs. Neighborhood selection estimates the conditional independence restrictions separately for each node in the graph and is hence equivalent to variable selection for Gaussian linear models. We show that the proposed neighborhood selection scheme is consistent for sparse high-dimensional graphs. Consistency hinges on the choice of the penalty parameter. The oracle value for optimal prediction does not lead to a consistent neighborhood estimate. Controlling instead the probability of falsely joining some distinct connectivity components of the graph, consistent estimation for sparse graphs is achieved (with exponential rates), even when the number of variables grows as the number of observations raised to an arbitrary power.
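The node-wise scheme described in the abstract can be sketched in a few lines: regress each variable on all the others with the Lasso, read off the nonzero coefficients as estimated neighbors, and symmetrize with an "or" or "and" rule. This is a minimal illustration only — it uses a hand-rolled coordinate-descent Lasso rather than the LARS solver, and the function names, penalty value, and demo graph are ours, not the paper's:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent Lasso for standardized columns:
    minimizes 0.5*||y - X b||^2 + lam * n * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for k in range(p):
            # partial residual with coordinate k removed
            r = y - X @ beta + X[:, k] * beta[k]
            rho = X[:, k] @ r
            # soft-thresholding update
            beta[k] = np.sign(rho) * max(abs(rho) - lam * n, 0.0) / col_sq[k]
    return beta

def neighborhood_selection(X, lam, rule="or"):
    """Node-wise Lasso: regress each variable on all others;
    nonzero coefficients define the estimated neighborhood."""
    p = X.shape[1]
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        beta = lasso_cd(X[:, others], X[:, j], lam)
        adj[j, others] = beta != 0
    # "or" joins edges found from either endpoint, "and" requires both
    return adj | adj.T if rule == "or" else adj & adj.T

# Demo on an AR(1) Gaussian, whose true graph is the chain 0-1-2-3-4
rng = np.random.default_rng(0)
p, n, rho = 5, 500, 0.5
cov = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
X = (X - X.mean(0)) / X.std(0)  # standardize columns
adj = neighborhood_selection(X, lam=0.1)
```

As the abstract notes, consistency of the recovered graph hinges on how `lam` is chosen: a prediction-optimal value includes too many edges, while a choice controlling the probability of falsely connecting distinct components yields consistent graph estimates.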
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
Cites work
- scientific article; zbMATH DE number 469396
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 845714
- A Statistical View of Some Chemometrics Regression Tools
- Asymptotics for Lasso-type estimators.
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Functional aggregation for nonparametric regression.
- Gaussian Markov distributions over finite graphs
- Introduction to Graphical Modelling
- Least angle regression. (With discussion)
- Linear Model Selection by Cross-Validation
- Model selection for Gaussian concentration graphs
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Weak convergence and empirical processes. With applications to statistics
Cited in
- Focused vector information criterion model selection and model averaging regression with missing response
- Variable selection, monotone likelihood ratio and group sparsity
- Bayesian high-dimensional screening via MCMC
- Estimation of high-dimensional partially-observed discrete Markov random fields
- Probabilistic graphical models and Markov networks
- Dynamic networks with multi-scale temporal structure
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- Fast and adaptive sparse precision matrix estimation in high dimensions
- General nonexact oracle inequalities for classes with a subexponential envelope
- Mirror averaging with sparsity priors
- Semiparametric regression models with additive nonparametric components and high dimensional parametric components
- Generalization of constraints for high dimensional regression problems
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Estimating spatial covariance using penalised likelihood with weighted \(L_1\) penalty
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- Estimation and variable selection with exponential weights
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
- Sparse covariance thresholding for high-dimensional variable selection
- Causal statistical inference in high dimensions
- Profiled adaptive elastic-net procedure for partially linear models with high-dimensional covariates
- Quadratic approximation on SCAD penalized estimation
- High-Dimensional Sparse Additive Hazards Regression
- An overview of recent developments in genomics and associated statistical methods
- Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model
- High dimensional change point inference: recent developments and extensions
- Classifier variability: accounting for training and testing
- Spatio-temporal random fields: compressible representation and distributed estimation
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- Sparse estimation of high-dimensional inverse covariance matrices with explicit eigenvalue constraints
- Recovering networks from distance data
- Penalized profiled semiparametric estimating functions
- Multivariate Bernoulli distribution
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Rapid penalized likelihood-based outlier detection via heteroskedasticity test
- Multi-stage convex relaxation for feature selection
- Exact covariance thresholding into connected components for large-scale graphical lasso
- Goodness-of-fit tests for high-dimensional Gaussian linear models
- Sparse inverse kernel Gaussian Process regression
- Sparsity in penalized empirical risk minimization
- A note on the asymptotic distribution of lasso estimator for correlated data
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Nonconcave penalized composite conditional likelihood estimation of sparse Ising models
- CAM: causal additive models, high-dimensional order search and penalized regression
- Bayesian sparse graphical models for classification with application to protein expression data
- High-dimensional Bayesian inference in nonparametric additive models
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Transductive versions of the Lasso and the Dantzig selector
- Parametric or nonparametric? A parametricness index for model selection
- A joint convex penalty for inverse covariance matrix estimation
- Time varying undirected graphs
- High-Dimensional Gaussian Graphical Regression Models with Covariates
- Regularization and variable selection in Heckman selection model
- Structured Lasso for regression with matrix covariates
- Consistent group selection in high-dimensional linear regression
- Covariance and precision matrix estimation for high-dimensional time series
- Penalized estimation in high-dimensional hidden Markov models with state-specific graphical models
- DASSO: Connections Between the Dantzig Selector and Lasso
- Variable selection in high-dimensional quantile varying coefficient models
- High-dimensional Ising model selection with Bayesian information criteria
- High-dimensional covariance matrix estimation with missing observations
- Modeling item-item similarities for personalized recommendations on Yahoo! front page
- Flexible covariance estimation in graphical Gaussian models
- Estimation of high-dimensional graphical models using regularized score matching
- Estimation of high-dimensional low-rank matrices
- Variable selection in nonparametric additive models
- Statistics for big data: a perspective
- Preconditioning the Lasso for sign consistency
- Oracle inequalities for the lasso in the Cox model
- Flexible and Interpretable Models for Survival Data
- Worst possible sub-directions in high-dimensional models
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Structural pursuit over multiple undirected graphs
- Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models
- Stability
- A general theory of concave regularization for high-dimensional sparse estimation problems
- High-dimensional regression with unknown variance
- Gemini: graph estimation with matrix variate normal instances
- Review of statistical network analysis: models, algorithms, and software
- Variable selection for high dimensional multivariate outcomes
- Learning Sparse Causal Gaussian Networks With Experimental Intervention: Regularization and Coordinate Descent
- Relaxed Lasso
- Latent variable graphical model selection via convex optimization
- Conditional score matching for high-dimensional partial graphical models
- A selective review of group selection in high-dimensional models
- Higher criticism for large-scale inference, especially for rare and weak effects
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Discussion: Latent variable graphical model selection via convex optimization
- Consistent high-dimensional Bayesian variable selection via penalized credible regions
- Estimating time-varying networks
- Likelihood-based selection and sharp parameter estimation
- Hypothesis Testing of Matrix Graph Model with Application to Brain Connectivity Analysis
- Estimation in high-dimensional linear models with deterministic design matrices
- High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
- An Expectation Conditional Maximization Approach for Gaussian Graphical Models
- Tuning-free heterogeneous inference in massive networks
- Sparse Matrix Graphical Models
- Bootstrap inference for network construction with an application to a breast cancer microarray study
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Robust regression through the Huber's criterion and adaptive lasso penalty