Improved complexities of conditional gradient-type methods with applications to robust matrix recovery problems
DOI: 10.1007/s10107-019-01452-6 | zbMATH: 1459.90146 | arXiv: 1802.05581 | OpenAlex: W2991366040 | Wikidata: Q126641191 (Scholia: Q126641191) | MaRDI QID: Q2227534
Dan Garber, Atara Kaplan, Shoham Sabach
Publication date: 15 February 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1802.05581
Keywords: convex optimization, semidefinite programming, conditional gradient method, low-rank matrix recovery, robust PCA, Frank-Wolfe algorithm, nuclear norm minimization, low-rank optimization
MSC classifications: Semidefinite programming (90C22); Large-scale problems in mathematical programming (90C06); Randomized algorithms (68W20); Online algorithms; streaming algorithms (68W27); Robustness in mathematical programming (90C17)
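As background for the keywords above (conditional gradient / Frank-Wolfe, nuclear norm minimization, low-rank matrix recovery), a minimal sketch of a standard Frank-Wolfe iteration over a nuclear-norm ball follows. This is a generic textbook variant, not the improved methods of the paper itself; the function names, the step-size rule, and the toy problem are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_nucnorm(grad_f, X0, tau, T=200):
    """Generic Frank-Wolfe over the nuclear-norm ball {X : ||X||_* <= tau}.

    The linear minimization oracle only needs the top singular pair of the
    gradient, so each update direction is a rank-1 matrix. (Illustrative
    sketch; not the paper's improved conditional gradient variant.)
    """
    X = X0
    for t in range(T):
        G = grad_f(X)
        # Rank-1 LMO: vertex of the ball from the top singular vectors of G.
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        S = -tau * np.outer(U[:, 0], Vt[0, :])
        gamma = 2.0 / (t + 2.0)  # standard diminishing step size
        X = (1 - gamma) * X + gamma * S
    return X

# Toy use: recover a rank-2 matrix M by minimizing 0.5 * ||X - M||_F^2.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
grad = lambda X: X - M
tau = np.linalg.svd(M, compute_uv=False).sum()  # ||M||_* as the radius
X_hat = frank_wolfe_nucnorm(grad, np.zeros_like(M), tau)
```

Because every iterate is a convex combination of rank-1 vertices, the method maintains a feasible, compactly representable iterate throughout, which is the scalability property the keywords refer to.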
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Introductory lectures on convex optimization. A basic course.
- Scalable Robust Matrix Recovery: Frank-Wolfe Meets Proximal Methods
- Robust principal component analysis?
- Robust PCA via Outlier Pursuit
- Regularization and Variable Selection via the Elastic Net
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization