High-dimensional graphs and variable selection with the Lasso
DOI: 10.1214/009053606000000281 · zbMATH Open: 1113.62082 · arXiv: math/0608017 · OpenAlex: W3098834468 · Wikidata: Q105584248 · Scholia: Q105584248 · MaRDI QID: Q2500458 · FDO: Q2500458
Authors: Nicolai Meinshausen, Peter Bühlmann
Publication date: 24 August 2006
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0608017
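The paper's core idea — estimate the edge set of a sparse Gaussian graphical model by running a Lasso regression of each variable on all the others and reading neighborhoods off the nonzero coefficients — can be sketched as follows. This is an illustrative toy implementation, not the authors' code: the coordinate-descent solver, the fixed penalty `lam`, the simulated 3-variable chain graph, and the use of the OR rule for combining neighborhoods are all choices made here for the example.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1.

    Assumes the columns of X are standardized (mean 0, variance 1).
    """
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove the fit of all coordinates except j.
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            # Soft-thresholding update.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (X[:, j] @ X[:, j] / n)
    return b

def neighborhood_select(X, lam):
    """Neighborhood selection: Lasso-regress each variable on the rest;
    an edge (a, j) is kept if j enters a's neighborhood (OR rule)."""
    n, p = X.shape
    A = np.zeros((p, p), dtype=bool)
    for a in range(p):
        others = [j for j in range(p) if j != a]
        b = lasso_cd(X[:, others], X[:, a], lam)
        for coef, j in zip(b, others):
            if abs(coef) > 1e-8:
                A[a, j] = True
    return A | A.T  # OR rule: symmetrize the estimated neighborhoods

# Toy data from a chain graph X1 - X2 - X3 (no direct X1 - X3 edge).
rng = np.random.default_rng(0)
n = 500
z = rng.standard_normal((n, 3))
x1 = z[:, 0]
x2 = 0.8 * x1 + z[:, 1]
x3 = 0.8 * x2 + z[:, 2]
X = np.column_stack([x1, x2, x3])
X = (X - X.mean(0)) / X.std(0)

A = neighborhood_select(X, lam=0.2)
```

On this chain graph the partial correlation between `x1` and `x3` given `x2` is zero, so with a moderate penalty the estimated graph keeps the edges (1,2) and (2,3) and drops (1,3) — the behavior the paper analyzes in high dimensions.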
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Multivariate analysis (62H99)
- Linear regression; mixed models (62J05)
- Applications of graph theory (05C90)
- Estimation in multivariate analysis (62H12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Introduction to Graphical Modelling
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- Title not available
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Asymptotics for Lasso-type estimators.
- Title not available
- Linear Model Selection by Cross-Validation
- A Statistical View of Some Chemometrics Regression Tools
- Gaussian Markov distributions over finite graphs
- Functional aggregation for nonparametric regression.
- Model selection for Gaussian concentration graphs
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Title not available
Cited In (first 100 items)
- Adaptive estimation of the baseline hazard function in the Cox model by model selection, with high-dimensional covariates
- Estimation and variable selection in partial linear single index models with error-prone linear covariates
- A focused information criterion for graphical models in fMRI connectivity with high-dimensional data
- Covariance estimation: the GLM and regularization perspectives
- Learning high-dimensional directed acyclic graphs with latent and selection variables
- UPS delivers optimal phase diagram in high-dimensional variable selection
- Covariate assisted screening and estimation
- Robust subspace clustering
- Group selection in high-dimensional partially linear additive models
- Adaptive Dantzig density estimation
- Influence measures and stability for graphical models
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- An inexact interior point method for \(L_{1}\)-regularized sparse covariance selection
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Berry-Esseen bounds for estimating undirected graphs
- Some theoretical results on the grouped variables Lasso
- Adaptive Lasso in high-dimensional settings
- A focused information criterion for graphical models
- Global solutions to folded concave penalized nonconvex learning
- Minimum distance Lasso for robust high-dimensional regression
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Testing a single regression coefficient in high dimensional linear models
- A majorization-minimization approach to variable selection using spike and slab priors
- On constrained and regularized high-dimensional regression
- Confidence intervals for high-dimensional partially linear single-index models
- Nonparametric eigenvalue-regularized precision or covariance matrix estimator
- Oracle inequalities for the Lasso in the high-dimensional Aalen multiplicative intensity model
- Model selection and estimation in the matrix normal graphical model
- Adaptive cluster expansion for the inverse Ising problem: convergence, algorithm and tests
- Near-ideal model selection by \(\ell _{1}\) minimization
- "Preconditioning" for feature selection and regression in high-dimensional problems
- Joint estimation of precision matrices in heterogeneous populations
- On estimation of the diagonal elements of a sparse precision matrix
- Variable selection for panel count data via non-concave penalized estimating function
- Network inference and biological dynamics
- Posterior convergence rates for estimating large precision matrices using graphical models
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- A new perspective on least squares under convex constraint
- On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- On the asymptotic properties of the group lasso estimator for linear models
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models
- A rank-corrected procedure for matrix completion with fixed basis coefficients
- Nonstationary Modeling With Sparsity for Spatial Data via the Basis Graphical Lasso
- Modeling dependent gene expression
- On confidence intervals for semiparametric expectile regression
- SPADES and mixture models
- Simultaneous variable selection and estimation in semiparametric modeling of longitudinal/clustered data
- Two tales of variable selection for high dimensional regression: Screening and model building
- Group-wise semiparametric modeling: a SCSE approach
- Bayesian discriminant analysis using a high dimensional predictor
- Banded regularization of autocovariance matrices in application to parameter estimation and forecasting of time series
- Estimation of covariance matrix via the sparse Cholesky factor with lasso
- Fused multiple graphical lasso
- A Bayesian approach to sparse dynamic network identification
- Nonnegative elastic net and application in index tracking
- Large covariance estimation through elliptical factor models
- Optimality of Graphlet Screening in High Dimensional Variable Selection
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- A note on the one-step estimator for ultrahigh dimensionality
- Self-concordant analysis for logistic regression
- Strong consistency of Lasso estimators
- Bayesian hyper-Lassos with non-convex penalization
- Strong oracle optimality of folded concave penalized estimation
- Joint high-dimensional Bayesian variable and covariance selection with an application to eQTL analysis
- High dimensional Gaussian copula graphical model with FDR control
- Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso
- Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage
- A sparse Ising model with covariates
- Forest Garrote
- Inferring gene-gene interactions and functional modules using sparse canonical correlation analysis
- Adaptive lasso for generalized linear models with a diverging number of parameters
- Quantile graphical models: a Bayesian approach
- Support union recovery in high-dimensional multivariate regression
- On the use of the Lasso for instrumental variables estimation with some invalid instruments
- Estimation of Positive Semidefinite Correlation Matrices by Using Convex Quadratic Semidefinite Programming
- Semiparametric efficiency bounds for high-dimensional models
- Sparsistency and agnostic inference in sparse PCA
- Estimating time-varying networks
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Hypothesis Testing of Matrix Graph Model with Application to Brain Connectivity Analysis
- Flexible covariance estimation in graphical Gaussian models
- Statistics for big data: a perspective
- Structured Lasso for regression with matrix covariates
- Consistent high-dimensional Bayesian variable selection via penalized credible regions
- Likelihood-based selection and sharp parameter estimation
- High-dimensional inference for personalized treatment decision
- Layer-wise learning strategy for nonparametric tensor product smoothing spline regression and graphical models
- Discussion: Latent variable graphical model selection via convex optimization
- Bootstrap inference for network construction with an application to a breast cancer microarray study
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Optimal detection of heterogeneous and heteroscedastic mixtures
- Robust Gaussian graphical modeling via \(l_{1}\) penalization
- Stable graphical model estimation with random forests for discrete, continuous, and mixed variables
- Structural pursuit over multiple undirected graphs
- Tuning-free heterogeneous inference in massive networks
- Fitting very large sparse Gaussian graphical models
- Simultaneous inference for pairwise graphical models with generalized score matching