Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
DOI: 10.1007/s10898-023-01293-w · OpenAlex: W4377943597 · MaRDI QID: Q6183086
Publication date: 26 January 2024
Published in: Journal of Global Optimization
Full work available at URL: https://doi.org/10.1007/s10898-023-01293-w
Recommendations
- Matrix recovery from nonconvex regularized least absolute deviations
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Robust matrix completion
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
Keywords: nonconvex optimization; linear convergence; proximal gradient methods; recovery bound; low-rank regularization
Mathematics Subject Classification:
- Estimation in multivariate analysis (62H12)
- Nonconvex programming, global optimization (90C26)
- Numerical methods for low-rank matrix approximation; matrix compression (65F55)
Cited works
- Measurement error in Lasso: impact and likelihood bias correction
- Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error
- Statistics for high-dimensional data. Methods, theory and applications.
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Measurement Error in Nonlinear Models
- Improved matrix uncertainty selector
- Sparse recovery under matrix uncertainty
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Modern Multivariate Statistical Techniques
- High-dimensional statistics. A non-asymptotic viewpoint
- Regularized Matrix Regression
- Tensor Regression with Applications in Neuroimaging Data Analysis
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Convex regularization for high-dimensional multiresponse tensor regression
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- An \(\{\ell_{1},\ell_{2},\ell_{\infty}\}\)-regularization approach to high-dimensional errors-in-variables models
- An optimal statistical and computational framework for generalized tensor estimation
- Efficient estimation in the errors in variables model
- Adaptive Minimax Estimation over Sparse $\ell_q$-Hulls
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Inference in high dimensional linear measurement error models
- CoCoLasso for high-dimensional error-in-variables regression
- Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls
- Linear and Conic Programming Estimators in High Dimensional Errors-in-variables Models
- \(\ell_{2,0}\)-norm based selection and estimation for multivariate generalized linear models
- Non-convex projected gradient descent for generalized low-rank tensor regression
- Estimating sparse networks with hubs
- High-dimensional VAR with low-rank transition
- Scalable interpretable learning for multi-response error-in-variables regression