Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements

From MaRDI portal

Publication:5280996

DOI: 10.1109/TIT.2011.2111771
zbMath: 1366.90160
OpenAlex: W2162451874
MaRDI QID: Q5280996

Emmanuel J. Candès, Yaniv Plan

Publication date: 27 July 2017

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://doi.org/10.1109/tit.2011.2111771

Related Items (showing first 100 items)

An optimal statistical and computational framework for generalized tensor estimation
A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
Two-stage convex relaxation approach to least squares loss constrained low-rank plus sparsity optimization problems
Optimal large-scale quantum state tomography with Pauli measurements
Signal recovery under cumulative coherence
Tight risk bound for high dimensional time series completion
Matrix completion via max-norm constrained optimization
Stable recovery of low-rank matrix via nonconvex Schatten \(p\)-minimization
Geometric inference for general high-dimensional linear inverse problems
Estimation of low rank density matrices: bounds in Schatten norms and other distances
An inexact proximal DC algorithm with sieving strategy for rank constrained least squares semidefinite programming
Sparse Model Uncertainties in Compressed Sensing with Application to Convolutions and Sporadic Communication
Tensor Completion in Hierarchical Tensor Representations
Sharp MSE bounds for proximal denoising
Low rank matrix recovery from rank one measurements
Trace regression model with simultaneously low rank and row(column) sparse parameter
Low rank estimation of smooth kernels on graphs
Low rank tensor recovery via iterative hard thresholding
On the robustness of noise-blind low-rank recovery from rank-one measurements
The bounds of restricted isometry constants for low rank matrices recovery
On the Schatten \(p\)-quasi-norm minimization for low-rank matrix recovery
\(s\)-goodness for low-rank matrix recovery
Robust recovery of low-rank matrices with non-orthogonal sparse decomposition from incomplete measurements
Guarantees of Riemannian optimization for low rank matrix completion
Simple bounds for recovering low-complexity models
The convex geometry of linear inverse problems
Iterative hard thresholding for low-rank recovery from rank-one projections
The minimal measurement number for low-rank matrix recovery
Stability of the elastic net estimator
High-dimensional VAR with low-rank transition
Terracini convexity
Regularized sample average approximation for high-dimensional stochastic optimization under low-rankness
On signal detection and confidence sets for low rank inference problems
Time for dithering: fast and quantized random embeddings via the restricted isometry property
High-dimensional estimation with geometric constraints
Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
Adaptive iterative hard thresholding for low-rank matrix recovery and rank-one measurements
Stable low-rank matrix recovery via null space properties
Rank penalized estimators for high-dimensional matrices
RIPless compressed sensing from anisotropic measurements
On a unified view of nullspace-type conditions for recoveries associated with general sparsity structures
Uniqueness conditions for low-rank matrix recovery
Adaptive confidence sets for matrix completion
Asymptotic equivalence of quantum state tomography and noisy matrix completion
Painless breakups -- efficient demixing of low rank matrices
Von Neumann entropy penalization and low-rank matrix estimation
Guaranteed clustering and biclustering via semidefinite programming
A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery
Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
Cross: efficient low-rank tensor completion
Signal recovery under mutual incoherence property and oracle inequalities
On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
Learning semidefinite regularizers
Theoretical investigation of generalization bounds for adversarial learning of deep neural networks
Decomposable norm minimization with proximal-gradient homotopy algorithm
Dimensionality reduction with subgaussian matrices: a unified theory
Regularization and the small-ball method. I: Sparse recovery
Sharp RIP bound for sparse signal and low-rank matrix recovery
Convergence of projected Landweber iteration for matrix rank minimization
Equivalent Lipschitz surrogates for zero-norm and rank optimization problems
ELASTIC-NET REGULARIZATION FOR LOW-RANK MATRIX RECOVERY
Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO
Templates for convex cone problems with applications to sparse signal recovery
Estimation of high-dimensional low-rank matrices
Estimation of (near) low-rank matrices with noise and high-dimensional scaling
Optimal selection of reduced rank estimators of high-dimensional matrices
Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
Geometric median and robust estimation in Banach spaces
Low Complexity Regularization of Linear Inverse Problems
Low-rank matrix recovery via regularized nuclear norm minimization
Recovery of Low Rank Symmetric Matrices via Schatten p Norm Minimization
Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery
On the exponentially weighted aggregate with the Laplace prior
Oracle posterior contraction rates under hierarchical priors
Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
Tensor theta norms and low rank recovery
Error bound of critical points and KL property of exponent 1/2 for squared F-norm regularized factorization
Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
RIP-based performance guarantee for low-tubal-rank tensor recovery
Approximation of generalized ridge functions in high dimensions
Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
Regularization and the small-ball method II: complexity dependent error rates
Spectral thresholding for the estimation of Markov chain transition operators
Truncated sparse approximation property and truncated \(q\)-norm minimization
Low Rank Estimation of Similarities on Graphs
Non-intrusive tensor reconstruction for high-dimensional random PDEs
EXACT LOW-RANK MATRIX RECOVERY VIA NONCONVEX SCHATTEN p-MINIMIZATION
Optimal RIP bounds for sparse signals recovery via \(\ell_p\) minimization
On Cross-Validation for Sparse Reduced Rank Regression
ROP: matrix recovery via rank-one projections
Truncated $l_{1-2}$ Models for Sparse Recovery and Rank Minimization
Stable recovery of low rank matrices from nuclear norm minimization
Stable recovery of analysis based approaches
On two continuum armed bandit problems in high dimensions
Solving variational inequalities with monotone operators on domains given by linear minimization oracles

This page was built for publication: Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements