Scalable interpretable learning for multi-response error-in-variables regression
Abstract: Corrupted data sets containing noisy or missing observations are prevalent in contemporary applications such as economics, finance, and bioinformatics. Despite recent methodological and algorithmic advances in high-dimensional multi-response regression, it remains unclear how to achieve scalable and interpretable estimation under contaminated covariates. In this paper, we develop a new methodology called convex conditioned sequential sparse learning (COSS) for error-in-variables multi-response regression under both additive measurement errors and random missing data. It combines the strengths of the recently developed sequential sparse factor regression and the nearest positive semi-definite matrix projection, and thus enjoys stepwise convexity and scalability in large-scale association analyses. Comprehensive theoretical guarantees are provided, and we demonstrate the effectiveness of the proposed methodology through numerical studies.
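The nearest positive semi-definite projection mentioned in the abstract is the convex-conditioning device also used in CoCoLasso: under additive measurement error Z = X + W with known noise covariance Sigma_W, the natural surrogate Gram matrix Z^T Z / n - Sigma_W is unbiased for X^T X / n but can be indefinite, and projecting it onto the PSD cone (in Frobenius norm, by truncating negative eigenvalues) restores convexity of each downstream least-squares step. Below is a minimal sketch of that projection step, assuming additive errors with known covariance; the function names and toy data are illustrative, not the authors' implementation.

```python
import numpy as np

def nearest_psd(A):
    """Project a symmetric matrix onto the PSD cone in Frobenius norm
    by zeroing out its negative eigenvalues."""
    A_sym = (A + A.T) / 2
    eigval, eigvec = np.linalg.eigh(A_sym)
    return eigvec @ np.diag(np.clip(eigval, 0.0, None)) @ eigvec.T

def corrected_gram(Z, Sigma_W):
    """Unbiased surrogate of X^T X / n under additive errors Z = X + W,
    projected to the nearest PSD matrix so that downstream least-squares
    subproblems stay convex. Sigma_W is the (assumed known) covariance
    of the measurement error W."""
    n = Z.shape[0]
    surrogate = Z.T @ Z / n - Sigma_W  # unbiased but possibly indefinite
    return nearest_psd(surrogate)

# Toy usage: n = 200 samples, p = 50 covariates observed with noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
Sigma_W = 0.25 * np.eye(50)           # error variance 0.25 per covariate
Z = X + 0.5 * rng.standard_normal((200, 50))
G = corrected_gram(Z, Sigma_W)
print(np.linalg.eigvalsh(G).min() >= -1e-10)  # PSD after projection
```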
Recommendations
- CoCoLasso for high-dimensional error-in-variables regression
- Balanced estimation for high-dimensional measurement error models
- Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method
- Scalable interpretable multi-response regression via SEED
- Calibrated zero-norm regularized LS estimator for high-dimensional error-in-variables regression
Cites work
- Adaptive estimation of a quadratic functional by model selection
- Balanced estimation for high-dimensional measurement error models
- Calibrated multivariate regression with application to neural semantic basis discovery
- CoCoLasso for high-dimensional error-in-variables regression
- Efficient estimation in the errors in variables model
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Identification and QML estimation of multivariate and simultaneous equations spatial autoregressive models
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Linear and conic programming estimators in high dimensional errors-in-variables models
- Measurement Error in Nonlinear Models
- Missing values: sparse inverse covariance estimation and an extension to sparse regression
- Modern Multivariate Statistical Techniques
- Multivariate spatial autoregressive model for large scale social networks
- Optimal selection of reduced rank estimators of high-dimensional matrices
- SOFAR: Large-Scale Association Network Learning
- Scalable interpretable multi-response regression via SEED
- Scaled sparse linear regression
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse recovery under matrix uncertainty
- Sub-Gaussian random variables
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection for partially linear models with measurement errors
- Variable selection in measurement error models
Cited in (4)
- Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
- Parallel integrative learning for large-scale multi-response regression with incomplete outcomes
- Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method
- CoCoLasso for high-dimensional error-in-variables regression