High-dimensional graphs and variable selection with the Lasso
DOI: 10.1214/009053606000000281 · zbMATH Open: 1113.62082 · arXiv: math/0608017 · OpenAlex: W3098834468 · Wikidata: Q105584248 · Scholia: Q105584248 · MaRDI QID: Q2500458 · FDO: Q2500458
Authors: Nicolai Meinshausen, Peter Bühlmann
Publication date: 24 August 2006
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0608017
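The paper's core idea is neighborhood selection: estimate the conditional independence graph of a high-dimensional Gaussian by Lasso-regressing each variable on all the others and connecting it to the variables that receive nonzero coefficients. A minimal illustrative sketch of this nodewise procedure (not the authors' code; the chain-structured toy data, the fixed penalty `alpha=0.1`, and the "OR" edge-combination rule are assumptions for the example):

```python
# Neighborhood selection with the Lasso: for each node j, regress X_j on the
# remaining variables and add an edge j-k whenever the coefficient on X_k is
# nonzero. Edges from the p separate regressions are combined with the "OR" rule.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 5

# Synthetic Gaussian data with a simple chain dependence 0 - 1 - 2
X = rng.standard_normal((n, p))
X[:, 1] += 0.8 * X[:, 0]
X[:, 2] += 0.8 * X[:, 1]

adj = np.zeros((p, p), dtype=bool)
for j in range(p):
    others = [k for k in range(p) if k != j]
    # Fixed penalty level for illustration; the paper discusses how the
    # regularization parameter governs consistency of the selected graph.
    lasso = Lasso(alpha=0.1).fit(X[:, others], X[:, j])
    for k, coef in zip(others, lasso.coef_):
        if abs(coef) > 1e-8:
            adj[j, k] = True

# "OR" rule: keep an edge if either node's estimated neighborhood contains it
adj = adj | adj.T
```

With the chain data above, the recovered adjacency matrix links 0-1 and 1-2, matching the true conditional dependence structure.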
Recommendations
- Sparse inverse covariance estimation with the graphical lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Sparse covariance thresholding for high-dimensional variable selection
- A note on the Lasso for Gaussian graphical model selection
- Efficient estimation of covariance selection models
MSC classification:
- Asymptotic properties of parametric estimators (62F12)
- Multivariate analysis (62H99)
- Linear regression; mixed models (62J05)
- Applications of graph theory (05C90)
- Estimation in multivariate analysis (62H12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Introduction to Graphical Modelling
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- Title not available
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Asymptotics for Lasso-type estimators.
- Title not available
- Linear Model Selection by Cross-Validation
- A Statistical View of Some Chemometrics Regression Tools
- Gaussian Markov distributions over finite graphs
- Functional aggregation for nonparametric regression.
- Model selection for Gaussian concentration graphs
- Atomic decomposition by basis pursuit
- Dependency networks for inference, collaborative filtering, and data visualization
- Title not available
Cited In (only showing first 100 items)
- Probabilistic graphical models and Markov networks
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- Spatio-temporal random fields: compressible representation and distributed estimation
- Parametric or nonparametric? A parametricness index for model selection
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- Causal statistical inference in high dimensions
- Classifier variability: accounting for training and testing
- Multi-stage convex relaxation for feature selection
- Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model
- Profiled adaptive elastic-net procedure for partially linear models with high-dimensional covariates
- Goodness-of-fit tests for high-dimensional Gaussian linear models
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
- A note on the asymptotic distribution of lasso estimator for correlated data
- Semiparametric regression models with additive nonparametric components and high dimensional parametric components
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Estimation and variable selection with exponential weights
- Quadratic approximation on SCAD penalized estimation
- Mirror averaging with sparsity priors
- Sparse estimation of high-dimensional inverse covariance matrices with explicit eigenvalue constraints
- Sparse inverse kernel Gaussian Process regression
- Transductive versions of the Lasso and the Dantzig selector
- Fast and adaptive sparse precision matrix estimation in high dimensions
- Variable selection, monotone likelihood ratio and group sparsity
- Generalization of constraints for high dimensional regression problems
- Focused vector information criterion model selection and model averaging regression with missing response
- Rapid penalized likelihood-based outlier detection via heteroskedasticity test
- Bayesian high-dimensional screening via MCMC
- Estimation of high-dimensional partially-observed discrete Markov random fields
- Sparse covariance thresholding for high-dimensional variable selection
- A joint convex penalty for inverse covariance matrix estimation
- CAM: causal additive models, high-dimensional order search and penalized regression
- Bayesian sparse graphical models for classification with application to protein expression data
- High-dimensional Bayesian inference in nonparametric additive models
- An overview of recent developments in genomics and associated statistical methods
- High dimensional change point inference: recent developments and extensions
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- General nonexact oracle inequalities for classes with a subexponential envelope
- Multivariate Bernoulli distribution
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Recovering networks from distance data
- Penalized profiled semiparametric estimating functions
- Nonconcave penalized composite conditional likelihood estimation of sparse Ising models
- Sparsity in penalized empirical risk minimization
- High-Dimensional Sparse Additive Hazards Regression
- Exact covariance thresholding into connected components for large-scale graphical lasso
- Dynamic networks with multi-scale temporal structure
- Estimating spatial covariance using penalised likelihood with weighted \(L_1\) penalty
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Fitting sparse linear models under the sufficient and necessary condition for model identification
- Central limit theorem for linear spectral statistics of general separable sample covariance matrices with applications
- Recovery of partly sparse and dense signals
- Shrinkage tuning parameter selection in precision matrices estimation
- Sparse regression: scalable algorithms and empirical performance
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Block-diagonal precision matrix regularization for ultra-high dimensional data
- Data science, big data and statistics
- Model selection and local geometry
- Sorted concave penalized regression
- Learning semidefinite regularizers
- Treelets -- an adaptive multi-scale basis for sparse unordered data
- Gaussian and bootstrap approximations for high-dimensional U-statistics and their applications
- Regularization and the small-ball method. I: Sparse recovery
- Generalized Kalman smoothing: modeling and algorithms
- Discussion: Latent variable graphical model selection via convex optimization
- Necessary and sufficient conditions for variable selection consistency of the Lasso in high dimensions
- Bayesian graphical models for modern biological applications
- Robust machine learning by median-of-means: theory and practice
- Quasi-Bayesian estimation of large Gaussian graphical models
- Estimation of positive definite \(M\)-matrices and structure learning for attractive Gaussian Markov random fields
- Structural learning for Bayesian networks by testing complete separators in prime blocks
- Leave-one-out cross-validation is risk consistent for Lasso
- The Dantzig selector: recovery of signal via \(\ell_1 - \alpha\ell_2\) minimization
- Estimating finite mixtures of ordinal graphical models
- A sequential scaled pairwise selection approach to edge detection in nonparanormal graphical models
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Covariance structure approximation via gLasso in high-dimensional supervised classification
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- Estimating heterogeneous graphical models for discrete data with an application to roll call voting
- Learning relational dependency networks in hybrid domains
- Poisson dependency networks: gradient boosted models for multivariate count data
- High-dimensional rank-based graphical models for non-Gaussian functional data
- Inference for Nonparanormal Partial Correlation via Regularized Rank-Based Nodewise Regression
- High-dimensional Gaussian model selection on a Gaussian design
- High dimensional posterior convergence rates for decomposable graphical models
- Edge detection in sparse Gaussian graphical models
- Solving norm constrained portfolio optimization via coordinate-wise descent algorithms
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- Laplace error penalty-based variable selection in high dimension
- An efficient method for identifying statistical interactors in gene association networks
- Operator-valued kernel-based vector autoregressive models for network inference
- Copula Gaussian Graphical Models for Functional Data
- Nonparametric and high-dimensional functional graphical models
- On nonparametric feature filters in electromagnetic imaging
- Individual-specific, sparse inverse covariance estimation in generalized estimating equations
- Robust estimation of sparse precision matrix using adaptive weighted graphical lasso approach
- Accelerating a Gibbs sampler for variable selection on genomics data with summarization and variable pre-selection combining an array DBMS and R
- Title not available