Tuning-free heterogeneous inference in massive networks
DOI: 10.1080/01621459.2018.1537920 · zbMATH Open: 1428.62098 · arXiv: 1606.03803 · OpenAlex: W2905419188 · Wikidata: Q128753214 · Scholia: Q128753214 · MaRDI QID: Q148592 · FDO: Q148592
Authors: Zhao Ren, Yongjian Kang, Yingying Fan, Jinchi Lv
Publication date: 13 June 2016
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/1606.03803
Keywords: multiple networks; efficiency; sparsity; high dimensionality; heterogeneous group square-root Lasso; heterogeneous learning; large-scale inference; scalability
MSC classification: Point estimation (62F10); Asymptotic properties of parametric estimators (62F12); Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10); Asymptotic distribution theory in statistics (62E20)
Cites Work
- Title not available
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Gaussian graphical model estimation with false discovery rate control
- Graphical models, exponential families, and variational inference
- Title not available
- High-dimensional graphs and variable selection with the Lasso
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Title not available
- Model Selection and Estimation in Regression with Grouped Variables
- Sparse inverse covariance estimation with the graphical lasso
- Network exploration via the adaptive LASSO and SCAD penalties
- Scaled sparse linear regression
- Model selection and estimation in the Gaussian graphical model
- Title not available
- Joint estimation of multiple graphical models
- The Joint Graphical Lasso for Inverse Covariance Estimation Across Multiple Classes
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- High dimensional inverse covariance matrix estimation via linear programming
- A constrained \(\ell _{1}\) minimization approach to sparse precision matrix estimation
- Adaptive estimation of a quadratic functional by model selection.
- Nonparametric goodness-of-fit testing under Gaussian models
- Non-asymptotic minimax rates of testing in signal detection
- Time varying undirected graphs
- Minimax estimation of linear and quadratic functionals on sparsity classes
- Covariance and precision matrix estimation for high-dimensional time series
- Joint estimation of multiple high-dimensional precision matrices
- Innovated scalable efficient estimation in ultra-large Gaussian graphical models
- Maximum Likelihood Estimation of a Multi-Dimensional Log-Concave Density
- Structural Pursuit Over Multiple Undirected Graphs
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- Joint Estimation of Multiple Graphical Models from High Dimensional Time Series
- Sparse precision matrix estimation via lasso penalized D-trace loss
- Scalable Algorithms for Data and Network Analysis
- Oracle inequalities and optimal inference under group sparsity
- Estimating time-varying networks
- The benefit of group sparsity
Cited In (4)