Convergence of fixed-point continuation algorithms for matrix rank minimization

From MaRDI portal

Publication:535287

DOI: 10.1007/S10208-011-9084-6
zbMath: 1219.90195
arXiv: 0906.3499
OpenAlex: W2001801912
MaRDI QID: Q535287

Donald Goldfarb, Shi-Qian Ma

Publication date: 11 May 2011

Published in: Foundations of Computational Mathematics

Full work available at URL: https://arxiv.org/abs/0906.3499






Related Items (29)

On linear convergence of projected gradient method for a class of affine rank minimization problems
Low-Rank and Sparse Multi-task Learning
Minimum \( n\)-rank approximation via iterative hard thresholding
Guarantees of Riemannian optimization for low rank matrix completion
Fast alternating linearization methods for minimizing the sum of two convex functions
Iterative hard thresholding for low-rank recovery from rank-one projections
An approximation theory of matrix rank minimization and its application to quadratic equations
Nonnegative Low Rank Matrix Completion by Riemannian Optimalization Methods
Accelerated linearized Bregman method
CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion
Compressive Sensing
Krylov Methods for Low-Rank Regularization
Learning semidefinite regularizers
Fast and provable algorithms for spectrally sparse signal reconstruction via low-rank Hankel matrix completion
Convergence of iterative hard-thresholding algorithm with continuation
Matrix recipes for hard thresholding methods
Convergence of projected Landweber iteration for matrix rank minimization
Low-Rank Tensor Recovery using Sequentially Optimal Modal Projections in Iterative Hard Thresholding (SeMPIHT)
Fixed point and Bregman iterative methods for matrix rank minimization
Homotopy method for matrix rank minimization based on the matrix hard thresholding method
Low-rank matrix recovery via regularized nuclear norm minimization
Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery
A simple prior-free method for non-rigid structure-from-motion factorization
Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
Minimization of the difference of Nuclear and Frobenius norms for noisy low rank matrix recovery
An adaptation for iterative structured matrix completion
Quasi-linear Compressed Sensing
Weighted lp − l1 minimization methods for block sparse recovery and rank minimization
Stable recovery of low rank matrices from nuclear norm minimization


Uses Software



Cites Work




This page was built for publication: Convergence of fixed-point continuation algorithms for matrix rank minimization