High-dimensional graphs and variable selection with the Lasso
DOI: 10.1214/009053606000000281 · zbMATH Open: 1113.62082 · arXiv: math/0608017 · OpenAlex: W3098834468 · Wikidata: Q105584248 · Scholia: Q105584248 · MaRDI QID: Q2500458 · FDO: Q2500458
Authors: Nicolai Meinshausen, Peter Bühlmann
Publication date: 24 August 2006
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0608017
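The paper's core idea is neighborhood selection: estimate the edge set of a sparse Gaussian graphical model by running a separate Lasso regression of each variable on all the others, then declaring an edge where the coefficient is nonzero. Below is a minimal NumPy sketch of that scheme; the coordinate-descent Lasso solver, the fixed penalty `lam`, and the function names are illustrative assumptions, not the authors' implementation (the paper analyzes theoretical choices of the penalty level).

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent with soft-thresholding.
    Minimizes (1/(2n))||y - X b||^2 + lam * ||b||_1 (illustrative solver)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for k in range(p):
            # partial residual with feature k's contribution removed
            r = y - X @ beta + X[:, k] * beta[k]
            rho = X[:, k] @ r
            beta[k] = np.sign(rho) * max(abs(rho) - n * lam, 0.0) / col_sq[k]
    return beta

def neighborhood_selection(X, lam):
    """Nodewise Lasso in the spirit of Meinshausen-Buhlmann (2006):
    regress each variable on all others; nonzero coefficients define
    the estimated neighborhood of that node."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        beta = lasso_cd(X[:, others], X[:, j], lam)
        adj[j, others] = beta != 0
    # "AND" rule: keep an edge only if both nodewise regressions select it
    # (the paper also discusses the alternative "OR" rule)
    return adj & adj.T
```

On data drawn from a chain-structured Gaussian graphical model, the estimated adjacency matrix is symmetric with no self-loops and, for a suitable penalty level, recovers the chain edges.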
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
Mathematics Subject Classification
- 62F12 Asymptotic properties of parametric estimators
- 62H99 Multivariate analysis
- 62J05 Linear regression; mixed models
- 05C90 Applications of graph theory
- 62H12 Estimation in multivariate analysis
- 62J07 Ridge regression; shrinkage estimators (Lasso)
Cites Work
- Introduction to Graphical Modelling
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- Title not available
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Asymptotics for Lasso-type estimators.
- Title not available
- Linear Model Selection by Cross-Validation
- A Statistical View of Some Chemometrics Regression Tools
- Gaussian Markov distributions over finite graphs
- Functional aggregation for nonparametric regression.
- Model selection for Gaussian concentration graphs
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Title not available
Cited In (only showing first 100 items)
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Fitting sparse linear models under the sufficient and necessary condition for model identification
- Central limit theorem for linear spectral statistics of general separable sample covariance matrices with applications
- Maximum-type tests for high-dimensional regression coefficients using Wilcoxon scores
- Globally adaptive quantile regression with ultra-high dimensional data
- Recovery of partly sparse and dense signals
- Shrinkage tuning parameter selection in precision matrices estimation
- Sparse regression: scalable algorithms and empirical performance
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Block-diagonal precision matrix regularization for ultra-high dimensional data
- Data science, big data and statistics
- Model selection and local geometry
- Sorted concave penalized regression
- Learning semidefinite regularizers
- Gaussian and bootstrap approximations for high-dimensional U-statistics and their applications
- Regularization and the small-ball method. I: Sparse recovery
- Generalized Kalman smoothing: modeling and algorithms
- Discussion: Latent variable graphical model selection via convex optimization
- Necessary and sufficient conditions for variable selection consistency of the Lasso in high dimensions
- Bayesian graphical models for modern biological applications
- Robust machine learning by median-of-means: theory and practice
- Quasi-Bayesian estimation of large Gaussian graphical models
- Estimation of positive definite \(M\)-matrices and structure learning for attractive Gaussian Markov random fields
- Structural learning for Bayesian networks by testing complete separators in prime blocks
- Leave-one-out cross-validation is risk consistent for Lasso
- The Dantzig selector: recovery of signal via \(\ell_{1} - \alpha\ell_{2}\) minimization
- Estimating finite mixtures of ordinal graphical models
- A sequential scaled pairwise selection approach to edge detection in nonparanormal graphical models
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Covariance structure approximation via gLasso in high-dimensional supervised classification
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- Estimating heterogeneous graphical models for discrete data with an application to roll call voting
- Learning relational dependency networks in hybrid domains
- Poisson dependency networks: gradient boosted models for multivariate count data
- Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space
- High-dimensional rank-based graphical models for non-Gaussian functional data
- Inference for Nonparanormal Partial Correlation via Regularized Rank-Based Nodewise Regression
- High-dimensional Gaussian model selection on a Gaussian design
- High dimensional posterior convergence rates for decomposable graphical models
- Edge detection in sparse Gaussian graphical models
- Solving norm constrained portfolio optimization via coordinate-wise descent algorithms
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- An efficient method for identifying statistical interactors in gene association networks
- Copula Gaussian Graphical Models for Functional Data
- Nonparametric and high-dimensional functional graphical models
- On nonparametric feature filters in electromagnetic imaging
- Individual-specific, sparse inverse covariance estimation in generalized estimating equations
- Robust estimation of sparse precision matrix using adaptive weighted graphical lasso approach
- Accelerating a Gibbs sampler for variable selection on genomics data with summarization and variable pre-selection combining an array DBMS and R
- Title not available
- Topological techniques in model selection
- Bayesian hypothesis testing for Gaussian graphical models: conditional independence and order constraints
- The performance of covariance selection methods that consider decomposable models only
- Alternating Direction Methods for Latent Variable Gaussian Graphical Model Selection
- A greedy feature selection algorithm for big data of high dimensionality
- Learning Gaussian graphical models with fractional marginal pseudo-likelihood
- Bayesian analysis of nonparanormal graphical models using rank-likelihood
- A loss‐based prior for Gaussian graphical models
- Interpreting latent variables in factor models via convex optimization
- On path restoration for censored outcomes
- Laplace Error Penalty-based Variable Selection in High Dimension
- Estimating time-varying networks
- Adaptive estimation of the baseline hazard function in the Cox model by model selection, with high-dimensional covariates
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Hypothesis Testing of Matrix Graph Model with Application to Brain Connectivity Analysis
- Covariance estimation: the GLM and regularization perspectives
- Learning high-dimensional directed acyclic graphs with latent and selection variables
- UPS delivers optimal phase diagram in high-dimensional variable selection
- Flexible covariance estimation in graphical Gaussian models
- Covariate assisted screening and estimation
- Statistics for big data: a perspective
- Consistent high-dimensional Bayesian variable selection via penalized credible regions
- Likelihood-based selection and sharp parameter estimation
- High-dimensional inference for personalized treatment decision
- Discussion: Latent variable graphical model selection via convex optimization
- Bootstrap inference for network construction with an application to a breast cancer microarray study
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Optimal detection of heterogeneous and heteroscedastic mixtures
- Robust Gaussian graphical modeling via \(l_{1}\) penalization
- Stable graphical model estimation with random forests for discrete, continuous, and mixed variables
- Fitting very large sparse Gaussian graphical models
- Rejoinder: Latent variable graphical model selection via convex optimization
- Estimation of Gaussian graphs by model selection
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Time varying undirected graphs
- Graphical-model based high dimensional generalized linear models
- High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
- Discussion: Latent variable graphical model selection via convex optimization
- High-dimensional joint estimation of multiple directed Gaussian graphical models
- A two-step method for estimating high-dimensional Gaussian graphical models
- Gemini: graph estimation with matrix variate normal instances
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Structure estimation for discrete graphical models: generalized covariance matrices and their inverses
- Bayesian variable selection for high dimensional generalized linear models: convergence rates of the fitted densities
- Regularization and variable selection in Heckman selection model
- High-dimensional Ising model selection with Bayesian information criteria
- Variable selection in model-based clustering and discriminant analysis with a regularization approach
- Covariance and precision matrix estimation for high-dimensional time series
- Penalized estimation in high-dimensional hidden Markov models with state-specific graphical models