Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
From MaRDI portal
Publication: Q6183086
Recommendations
- Matrix recovery from nonconvex regularized least absolute deviations
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Robust matrix completion
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
Cites work
- Adaptive Minimax Estimation over Sparse \(\ell_q\)-Hulls
- An \(\{\ell_{1},\ell_{2},\ell_{\infty}\}\)-regularization approach to high-dimensional errors-in-variables models
- An optimal statistical and computational framework for generalized tensor estimation
- CoCoLasso for high-dimensional error-in-variables regression
- Convex regularization for high-dimensional multiresponse tensor regression
- Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error
- Efficient estimation in the errors in variables model
- Estimating sparse networks with hubs
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- High-dimensional VAR with low-rank transition
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- High-dimensional statistics. A non-asymptotic viewpoint
- Improved matrix uncertainty selector
- Inference in high dimensional linear measurement error models
- Linear and conic programming estimators in high dimensional errors-in-variables models
- Measurement Error in Nonlinear Models
- Measurement error in Lasso: impact and likelihood bias correction
- Modern Multivariate Statistical Techniques
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Non-convex projected gradient descent for generalized low-rank tensor regression
- Regularized Matrix Regression
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Scalable interpretable learning for multi-response error-in-variables regression
- Sparse recovery under matrix uncertainty
- Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls
- Statistics for high-dimensional data. Methods, theory and applications.
- Tensor Regression with Applications in Neuroimaging Data Analysis
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- \(\ell_{2,0}\)-norm based selection and estimation for multivariate generalized linear models