Tuning-free heterogeneous inference in massive networks
Abstract: Heterogeneity is often natural in many contemporary applications involving massive data. While posing new challenges to effective learning, it can play a crucial role in powering meaningful scientific discoveries through the understanding of important differences among subpopulations of interest. In this paper, we exploit multiple networks with Gaussian graphs to encode the connectivity patterns of a large number of features on the subpopulations. To uncover the heterogeneity of these structures across subpopulations, we suggest a new framework of tuning-free heterogeneity pursuit (THP) via large-scale inference, where the number of networks is allowed to diverge. In particular, two new tests, the chi-based test and the linear functional-based test, are introduced and their asymptotic null distributions are established. Under mild regularity conditions, we establish that both tests are optimal in achieving the testable region boundary and the sample size requirement for the latter test is minimal. Both theoretical guarantees and the tuning-free feature stem from efficient multiple-network estimation by our newly suggested approach of heterogeneous group square-root Lasso (HGSL) for high-dimensional multi-response regression with heterogeneous noises. To solve this convex program, we further introduce a tuning-free algorithm that is scalable and enjoys provable convergence to the global optimum. Both computational and theoretical advantages of our procedure are elucidated through simulation and real data examples.
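The HGSL program itself is not spelled out on this page. As a rough sketch only, following the usual square-root-Lasso conventions rather than the paper's own notation (the symbols \(y^{(k)}\), \(X^{(k)}\), \(n_k\), \(p\) and \(\lambda\) below are assumed for illustration), a heterogeneous group square-root Lasso for \(K\) subpopulation networks takes the form of a convex program such as
\[
\min_{\beta^{(1)},\dots,\beta^{(K)}}\;\sum_{k=1}^{K}\frac{1}{\sqrt{n_k}}\bigl\|y^{(k)}-X^{(k)}\beta^{(k)}\bigr\|_2\;+\;\lambda\sum_{j=1}^{p}\Bigl(\sum_{k=1}^{K}\bigl(\beta^{(k)}_j\bigr)^2\Bigr)^{1/2},
\]
where the square-root (unsquared) loss makes a theoretically valid choice of \(\lambda\) pivotal with respect to the unknown and possibly heterogeneous noise levels, which is the sense in which the procedure is tuning-free, and the group penalty couples the \(j\)-th coefficient across the \(K\) networks to pursue shared versus heterogeneous structure.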
Cites work
- scientific article; zbMATH DE number 720689 (no title available)
- scientific article; zbMATH DE number 1134987 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- A constrained \(\ell _{1}\) minimization approach to sparse precision matrix estimation
- Adaptive estimation of a quadratic functional by model selection
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Calibrated multivariate regression with application to neural semantic basis discovery
- Covariance and precision matrix estimation for high-dimensional time series
- Estimating time-varying networks
- Gaussian graphical model estimation with false discovery rate control
- Graphical models, exponential families, and variational inference
- High dimensional inverse covariance matrix estimation via linear programming
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- High-dimensional graphs and variable selection with the Lasso
- Innovated scalable efficient estimation in ultra-large Gaussian graphical models
- Joint estimation of multiple graphical models
- Joint estimation of multiple graphical models from high dimensional time series
- Joint estimation of multiple high-dimensional precision matrices
- Maximum likelihood estimation of a multi-dimensional log-concave density. With discussion and authors' reply
- Minimax estimation of linear and quadratic functionals on sparsity classes
- Model Selection and Estimation in Regression with Grouped Variables
- Model selection and estimation in the Gaussian graphical model
- Network exploration via the adaptive LASSO and SCAD penalties
- Non-asymptotic minimax rates of testing in signal detection
- Nonparametric goodness-of-fit testing under Gaussian models
- Oracle inequalities and optimal inference under group sparsity
- Scalable Algorithms for Data and Network Analysis
- Scaled sparse linear regression
- Sparse inverse covariance estimation with the graphical lasso
- Sparse precision matrix estimation via lasso penalized D-trace loss
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Structural pursuit over multiple undirected graphs
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- The Joint Graphical Lasso for Inverse Covariance Estimation Across Multiple Classes
- The benefit of group sparsity
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Time varying undirected graphs
Cited in (6)
- Hypothesis testing for high-dimensional multivariate regression with false discovery rate control
- Inference on Multi-level Partial Correlations Based on Multi-subject Time Series Data
- A partially linear framework for massive heterogeneous data
- Heterogeneity adjustment with applications to graphical model inference
- HGSL
- RANK: Large-Scale Inference With Graphical Nonlinear Knockoffs