Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
Publication: 5280996
DOI: 10.1109/TIT.2011.2111771
zbMath: 1366.90160
OpenAlex: W2162451874
MaRDI QID: Q5280996
Emmanuel J. Candès, Yaniv Plan
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2011.2111771
Mathematics Subject Classification: Convex programming (90C25); Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
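
The paper analyzes nuclear-norm penalized estimators for recovering a low-rank matrix from \(m\) noisy random measurements \(b = \mathcal{A}(X) + z\), with \(m\) on the order of \(nr\). The sketch below is an illustrative implementation, not the authors' code: it solves the matrix LASSO \(\min_X \frac{1}{2}\|\mathcal{A}(X) - b\|_2^2 + \mu\|X\|_*\) by proximal gradient descent (iterative singular value thresholding). The problem sizes, noise level, and the weight \(\mu = 0.1\) are assumptions chosen for the demo.

```python
# Minimal sketch of the matrix LASSO for low-rank recovery from
# Gaussian measurements, solved by proximal gradient descent.
# All parameter choices below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r = 30, 30, 2           # matrix dimensions and true rank
m = 6 * r * (n1 + n2)           # measurement budget, on the order of nr

# Ground-truth rank-r matrix and a Gaussian measurement operator A
X_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
A = rng.standard_normal((m, n1 * n2)) / np.sqrt(m)      # near-isometric on low-rank matrices
b = A @ X_true.ravel() + 0.01 * rng.standard_normal(m)  # noisy measurements

def svt(Z, tau):
    """Singular value thresholding: the proximal operator of tau * ||.||_*."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

mu = 0.1                                  # nuclear-norm weight (assumed)
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L for the quadratic term
X = np.zeros((n1, n2))
for _ in range(500):
    grad = (A.T @ (A @ X.ravel() - b)).reshape(n1, n2)  # gradient of the data fit
    X = svt(X - step * grad, step * mu)                 # proximal (thresholding) step

print("relative error:", np.linalg.norm(X - X_true) / np.linalg.norm(X_true))
```

With \(m \approx 6\,r(n_1 + n_2)\) measurements, well above the \(r(n_1 + n_2 - r)\) degrees of freedom, the relative error should be small, consistent with the minimal-measurement regime the paper's oracle inequalities address.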
Related Items (only showing first 100 items)
An optimal statistical and computational framework for generalized tensor estimation
A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
Two-stage convex relaxation approach to least squares loss constrained low-rank plus sparsity optimization problems
Optimal large-scale quantum state tomography with Pauli measurements
Signal recovery under cumulative coherence
Tight risk bound for high dimensional time series completion
Matrix completion via max-norm constrained optimization
Stable recovery of low-rank matrix via nonconvex Schatten \(p\)-minimization
Geometric inference for general high-dimensional linear inverse problems
Estimation of low rank density matrices: bounds in Schatten norms and other distances
An inexact proximal DC algorithm with sieving strategy for rank constrained least squares semidefinite programming
Sparse Model Uncertainties in Compressed Sensing with Application to Convolutions and Sporadic Communication
Tensor Completion in Hierarchical Tensor Representations
Sharp MSE bounds for proximal denoising
Low rank matrix recovery from rank one measurements
Trace regression model with simultaneously low rank and row(column) sparse parameter
Low rank estimation of smooth kernels on graphs
Low rank tensor recovery via iterative hard thresholding
On the robustness of noise-blind low-rank recovery from rank-one measurements
The bounds of restricted isometry constants for low rank matrices recovery
On the Schatten \(p\)-quasi-norm minimization for low-rank matrix recovery
\(s\)-goodness for low-rank matrix recovery
Robust recovery of low-rank matrices with non-orthogonal sparse decomposition from incomplete measurements
Guarantees of Riemannian optimization for low rank matrix completion
Simple bounds for recovering low-complexity models
The convex geometry of linear inverse problems
Iterative hard thresholding for low-rank recovery from rank-one projections
The minimal measurement number for low-rank matrix recovery
Stability of the elastic net estimator
High-dimensional VAR with low-rank transition
Terracini convexity
Regularized sample average approximation for high-dimensional stochastic optimization under low-rankness
On signal detection and confidence sets for low rank inference problems
Time for dithering: fast and quantized random embeddings via the restricted isometry property
High-dimensional estimation with geometric constraints: Table 1.
Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
Adaptive iterative hard thresholding for low-rank matrix recovery and rank-one measurements
Stable low-rank matrix recovery via null space properties
Rank penalized estimators for high-dimensional matrices
RIPless compressed sensing from anisotropic measurements
On a unified view of nullspace-type conditions for recoveries associated with general sparsity structures
Uniqueness conditions for low-rank matrix recovery
Adaptive confidence sets for matrix completion
Asymptotic equivalence of quantum state tomography and noisy matrix completion
Painless breakups -- efficient demixing of low rank matrices
Von Neumann entropy penalization and low-rank matrix estimation
Guaranteed clustering and biclustering via semidefinite programming
A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery
Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
Cross: efficient low-rank tensor completion
Signal recovery under mutual incoherence property and oracle inequalities
On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
Learning semidefinite regularizers
Theoretical investigation of generalization bounds for adversarial learning of deep neural networks
Decomposable norm minimization with proximal-gradient homotopy algorithm
Dimensionality reduction with subgaussian matrices: a unified theory
Regularization and the small-ball method. I: Sparse recovery
Sharp RIP bound for sparse signal and low-rank matrix recovery
Convergence of projected Landweber iteration for matrix rank minimization
Equivalent Lipschitz surrogates for zero-norm and rank optimization problems
ELASTIC-NET REGULARIZATION FOR LOW-RANK MATRIX RECOVERY
Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO
Templates for convex cone problems with applications to sparse signal recovery
Estimation of high-dimensional low-rank matrices
Estimation of (near) low-rank matrices with noise and high-dimensional scaling
Optimal selection of reduced rank estimators of high-dimensional matrices
Unnamed Item
Unnamed Item
Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
Geometric median and robust estimation in Banach spaces
Low Complexity Regularization of Linear Inverse Problems
Low-rank matrix recovery via regularized nuclear norm minimization
Recovery of Low Rank Symmetric Matrices via Schatten p Norm Minimization
Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery
On the exponentially weighted aggregate with the Laplace prior
Oracle posterior contraction rates under hierarchical priors
Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
Tensor theta norms and low rank recovery
Error bound of critical points and KL property of exponent 1/2 for squared F-norm regularized factorization
Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
RIP-based performance guarantee for low-tubal-rank tensor recovery
Approximation of generalized ridge functions in high dimensions
Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
Regularization and the small-ball method II: complexity dependent error rates
Spectral thresholding for the estimation of Markov chain transition operators
Truncated sparse approximation property and truncated \(q\)-norm minimization
Low Rank Estimation of Similarities on Graphs
Non-intrusive tensor reconstruction for high-dimensional random PDEs
EXACT LOW-RANK MATRIX RECOVERY VIA NONCONVEX SCHATTEN p-MINIMIZATION
Optimal RIP bounds for sparse signals recovery via \(\ell_p\) minimization
On Cross-Validation for Sparse Reduced Rank Regression
ROP: matrix recovery via rank-one projections
Truncated $l_{1-2}$ Models for Sparse Recovery and Rank Minimization
Stable recovery of low rank matrices from nuclear norm minimization
Stable recovery of analysis based approaches
On two continuum armed bandit problems in high dimensions
Solving variational inequalities with monotone operators on domains given by linear minimization oracles