Sparse matrix inversion with scaled Lasso
From MaRDI portal
Publication: Q2933952
zbMATH Open: 1318.62184 · arXiv: 1202.2723 · MaRDI QID: Q2933952 · FDO: Q2933952
Authors: Tingni Sun, Cun-Hui Zhang
Publication date: 8 December 2014
Abstract: We propose a new method of learning a sparse nonnegative-definite target matrix. Our primary example of the target matrix is the inverse of a population covariance or correlation matrix. The algorithm first estimates each column of the target matrix by the scaled Lasso and then adjusts the matrix estimator to be symmetric. The penalty level of the scaled Lasso for each column is completely determined by the data via convex minimization, without using cross-validation. We prove that this scaled Lasso method guarantees the fastest proven rate of convergence in the spectrum norm under conditions of weaker form than those in the existing analyses of other \(\ell_1\)-regularized algorithms, and has a faster guaranteed rate of convergence when the ratio of the \(\ell_1\) and spectrum norms of the target inverse matrix diverges to infinity. A simulation study demonstrates the computational feasibility and superb performance of the proposed method. Our analysis also provides new performance bounds for the Lasso and scaled Lasso to guarantee higher concentration of the error at a smaller threshold level than previous analyses, and to allow the use of the union bound in column-by-column applications of the scaled Lasso without an adjustment of the penalty level. In addition, the least squares estimation after the scaled Lasso selection is considered and proven to guarantee performance bounds similar to those of the scaled Lasso.
Full work available at URL: https://arxiv.org/abs/1202.2723
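The abstract describes a two-step scheme: estimate each column of the precision matrix by the scaled Lasso (which determines its own penalty level jointly with the noise scale), then symmetrize. The following is a minimal NumPy sketch of that scheme, not the authors' implementation: the coordinate-descent Lasso solver, the iteration counts, and the default penalty level `lam0 = sqrt(2 log p / n)` are illustrative choices, and the symmetrization keeps the smaller-magnitude entry of each off-diagonal pair.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso: min_b (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y - X @ b  # running residual
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0:
                continue
            r += X[:, j] * b[j]          # remove coordinate j from residual
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]          # add it back with the new value
    return b

def scaled_lasso(X, y, lam0, n_outer=20):
    """Scaled Lasso: min_{b,sigma} ||y-Xb||^2/(2n sigma) + sigma/2 + lam0*||b||_1.
    Solved by alternating a Lasso step (penalty sigma*lam0) with the
    closed-form noise-scale update sigma = ||y - Xb|| / sqrt(n)."""
    n = X.shape[0]
    sigma = np.std(y) + 1e-12
    b = np.zeros(X.shape[1])
    for _ in range(n_outer):
        b = lasso_cd(X, y, sigma * lam0)
        sigma_new = np.linalg.norm(y - X @ b) / np.sqrt(n)
        if abs(sigma_new - sigma) < 1e-6:
            sigma = sigma_new
            break
        sigma = sigma_new
    return b, sigma

def precision_scaled_lasso(X, lam0=None):
    """Column-by-column precision matrix estimate, then symmetrization.
    Assumes the columns of X are (approximately) centered."""
    n, p = X.shape
    if lam0 is None:
        lam0 = np.sqrt(2 * np.log(p) / n)  # illustrative universal penalty level
    Omega = np.zeros((p, p))
    for j in range(p):
        idx = [k for k in range(p) if k != j]
        b, sigma = scaled_lasso(X[:, idx], X[:, j], lam0)
        Omega[j, j] = 1.0 / sigma ** 2      # Gaussian identity: Omega_jj = 1/sigma_j^2
        Omega[idx, j] = -b / sigma ** 2
    # symmetrize: for each (i, j) pair keep the entry of smaller magnitude
    keep_T = np.abs(Omega.T) < np.abs(Omega)
    return np.where(keep_T, Omega.T, Omega)
```

Because the scaled Lasso objective is jointly convex in the coefficient vector and the noise scale, the alternation above converges, and no cross-validation is needed to pick the penalty for each column.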
Keywords: linear regression; precision matrix; graphical model; inverse matrix; concentration matrix; spectrum norm; scaled Lasso
MSC classifications: Asymptotic properties of parametric estimators (62F12) · Estimation in multivariate analysis (62H12) · Ridge regression; shrinkage estimators (Lasso) (62J07)
Cited In (58)
- On cross-validated Lasso in high dimensions
- A study on tuning parameter selection for the high-dimensional lasso
- Complexity analysis of Bayesian learning of high-dimensional DAG models and their equivalence classes
- A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions
- Functional Graphical Models
- Integrative Factor Regression and Its Inference for Multimodal Data Analysis
- Sorted concave penalized regression
- Layer-wise learning strategy for nonparametric tensor product smoothing spline regression and graphical models
- Structure learning of exponential family graphical model with false discovery rate control
- Simultaneous inference for pairwise graphical models with generalized score matching
- Oracle inequalities for high-dimensional prediction
- Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation
- A tuning-free robust and efficient approach to high-dimensional regression
- High-dimensional Markowitz portfolio optimization problem: empirical comparison of covariance matrix estimators
- Quasi-Bayesian estimation of large Gaussian graphical models
- A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models
- An efficient parallel block coordinate descent algorithm for large-scale precision matrix estimation using graphics processing units
- Inference on Multi-level Partial Correlations Based on Multi-subject Time Series Data
- High-dimensional inference for cluster-based graphical models
- A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity
- On estimation of the diagonal elements of a sparse precision matrix
- Minimax estimation of large precision matrices with bandable Cholesky factor
- Fast and adaptive sparse precision matrix estimation in high dimensions
- A sequential scaled pairwise selection approach to edge detection in nonparanormal graphical models
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- High-dimensional varying index coefficient models via Stein's identity
- Contraction of a quasi-Bayesian model with shrinkage priors in precision matrix estimation
- Efficient computation for differential network analysis with applications to quadratic discriminant analysis
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Confidence regions for entries of a large precision matrix
- On semiparametric exponential family graphical models
- Confidence intervals for high-dimensional inverse covariance estimation
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Honest confidence regions and optimality in high-dimensional precision matrix estimation
- Goodness-of-Fit Tests for High Dimensional Linear Models
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage
- Linear hypothesis testing for high dimensional generalized linear models
- Double-estimation-friendly inference for high-dimensional misspecified models
- An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled Lasso
- Inference for high-dimensional varying-coefficient quantile regression
- Debiasing convex regularized estimators and interval estimation in linear models
- Innovated scalable dynamic learning for time-varying graphical models
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
- Uniform joint screening for ultra-high dimensional graphical models
- Estimation of multiple networks with common structures in heterogeneous subgroups
- Empirical risk minimization: probabilistic complexity and stepsize strategy
- Asymptotic bias of the \(\ell_2\)-regularized error variance estimator
- High-dimensional regression with potential prior information on variable importance
- ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
- Testing the differential network between two Gaussian graphical models with false discovery rate control
- Bayesian approaches to variable selection: a comparative study from practical perspectives
- Response variable selection in multivariate linear regression
- Support recovery of Gaussian graphical model with false discovery rate control
- Robustly fitting Gaussian graphical models -- the R package robFitConGraph
- Envelope-based partial partial least squares with application to cytokine-based biomarker analysis for COVID-19
- A locally adaptive shrinkage approach to false selection rate control in high-dimensional classification
- Bayesian Structure Learning in Undirected Gaussian Graphical Models: Literature Review with Empirical Comparison