Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion

From MaRDI portal
Publication: 661157

DOI: 10.1214/11-AOS894 · zbMath: 1231.62097 · arXiv: 1011.6256 · OpenAlex: W2962769133 · Wikidata: Q105584275 · Scholia: Q105584275 · MaRDI QID: Q661157

Vladimir I. Koltchinskii, Karim Lounici, Alexandre B. Tsybakov

Publication date: 21 February 2012

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1011.6256
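For context, the estimator studied in the paper is nuclear-norm penalized least squares over the observed entries. A minimal numerical sketch of that idea follows, using proximal gradient descent with singular value soft-thresholding; the problem sizes, penalty level, and step size are illustrative choices for this sketch, not the paper's tuning or recommended rates.

```python
import numpy as np

def svt(M, tau):
    """Singular value soft-thresholding: the prox operator of tau * ||.||_*."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(Y, mask, lam, step=1.0, iters=300):
    """Proximal gradient for
         min_X  0.5 * || mask * (X - Y) ||_F^2  +  lam * ||X||_*
       where mask is 0/1 over observed entries of the noisy matrix Y."""
    X = np.zeros_like(Y)
    for _ in range(iters):
        grad = mask * (X - Y)          # gradient of the smooth data-fit term
        X = svt(X - step * grad, step * lam)  # prox step shrinks singular values
    return X

# Illustrative run: rank-2 signal, ~60% of entries observed with small noise.
rng = np.random.default_rng(0)
n = 30
A = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))
mask = (rng.random((n, n)) < 0.6).astype(float)
Y = mask * (A + 0.01 * rng.standard_normal((n, n)))
X_hat = complete(Y, mask, lam=0.5)
```

The soft-thresholding step is what the nuclear-norm penalty buys computationally: each iteration stays a single SVD, and the shrinkage drives small singular values to zero, so the iterates are low rank without an explicit rank constraint.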



Related Items

Matrix Completion Methods for Causal Panel Data Models
An Exact and Robust Conformal Inference Method for Counterfactual and Synthetic Controls
Matrix Completion under Low-Rank Missing Mechanism
Low Permutation-rank Matrices: Structural Properties and Noisy Completion
Lifting for Blind Deconvolution in Random Mask Imaging: Identifiability and Convex Relaxation
Deterministic algorithms for matrix completion
Statistically optimal and computationally efficient low rank tensor completion from noisy entries
Bayesian singular value regularization via a cumulative shrinkage process
Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso
Multistage Convex Relaxation Approach to Rank Regularized Minimization Problems Based on Equivalent Mathematical Program with a Generalized Complementarity Constraint
Matrix completion under complex survey sampling
High-dimensional latent panel quantile regression with an application to asset pricing
Oracle inequality for sparse trace regression models with exponential \(\beta\)-mixing errors
The rate of convergence for sparse and low-rank quantile trace regression
Robust Recommendation via Social Network Enhanced Matrix Completion
Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
A framework of regularized low-rank matrix models for regression and classification
Learning Markov Models Via Low-Rank Optimization
A randomised iterative method for solving factorised linear systems
Profile GMM estimation of panel data models with interactive fixed effects
Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-differentiable Priors
The Lasso with structured design and entropy of (absolute) convex hulls
Bayesian uncertainty quantification for low-rank matrix completion
Covariate-assisted matrix completion with multiple structural breaks
A portmanteau local feature discrimination approach to the classification with high-dimensional matrix-variate data
Robust matrix estimations meet Frank-Wolfe algorithm
Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data
High-dimensional estimation of quadratic variation based on penalized realized variance
High-dimensional VARs with common factors
Inference for low-rank models
Entrywise limit theorems for eigenvectors of signal-plus-noise matrix models with weak signals
Expectile trace regression via low-rank and group sparsity regularization
Inference for low-rank completion without sample splitting with application to treatment effect estimation
High-dimensional estimation with geometric constraints
Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
Factor Models for High-Dimensional Tensor Time Series
On a unified view of nullspace-type conditions for recoveries associated with general sparsity structures
Asymptotic equivalence of quantum state tomography and noisy matrix completion
Sparse PCA: optimal rates and adaptive estimation
Noisy low-rank matrix completion with general sampling distribution
On the finite-sample analysis of \(\Theta\)-estimators
Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
Oracle Inequalities for Local and Global Empirical Risk Minimizers
Matrix completion based on Gaussian parameterized belief propagation
Matrix Denoising for Weighted Loss Functions and Heterogeneous Signals
On tight bounds for the Lasso
Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
Projected principal component analysis in factor models
High-dimensional regression with unknown variance
Sparse estimation by exponential weighting
Discussion of "Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation"
FACTORISABLE MULTITASK QUANTILE REGRESSION
Intelligent Initialization and Adaptive Thresholding for Iterative Matrix Completion: Some Statistical and Algorithmic Theory for Adaptive-Impute
Matrix Completion With Covariate Information
Spectral Methods for Passive Imaging: Nonasymptotic Performance and Robustness
Sharp Oracle Inequalities for Square Root Regularization
Regularization and the small-ball method II: complexity dependent error rates
Low Rank Estimation of Similarities on Graphs
On the properties of variational approximations of Gibbs posteriors
Operator Splitting for a Homogeneous Embedding of the Linear Complementarity Problem
Estimation of a sparse and spiked covariance matrix
Iterative Methods for Solving Factorized Linear Systems
ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
An optimal statistical and computational framework for generalized tensor estimation
Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
Worst possible sub-directions in high-dimensional models
A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
On the prediction loss of the Lasso in the partially labeled setting
Convex optimization learning of faithful Euclidean distance representations in nonlinear dimensionality reduction
Ridge-type regularization method for questionnaire data analysis
Low-rank diffusion matrix estimation for high-dimensional time-changed Lévy processes
Optimal large-scale quantum state tomography with Pauli measurements
Generalized co-sparse factor regression
Inference for low-rank tensors -- no need to debias
Aggregated hold out for sparse linear regression with a robust loss function
A distributed Frank-Wolfe framework for learning low-rank matrices with the trace norm
Tight risk bound for high dimensional time series completion
Matrix completion via max-norm constrained optimization
Geometric inference for general high-dimensional linear inverse problems
A rank-corrected procedure for matrix completion with fixed basis coefficients
Estimation of low rank density matrices: bounds in Schatten norms and other distances
Estimation of matrices with row sparsity
Sharp MSE bounds for proximal denoising
A graphical approach to the analysis of matrix completion
Estimation of low-rank covariance function
Trace regression model with simultaneously low rank and row(column) sparse parameter
Low rank estimation of smooth kernels on graphs
How can we identify the sparsity structure pattern of high-dimensional data: an elementary statistical analysis to interpretable machine learning
Optimal prediction in the linearly transformed spiked model
Double instrumental variable estimation of interaction models with big data
Matrix completion by singular value thresholding: sharp bounds
Bayesian linear regression with sparse priors
High-dimensional covariance matrix estimation with missing observations
General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems
Sparse covariance matrix estimation in high-dimensional deconvolution
Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
Random perturbation of low rank matrices: improving classical bounds
Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
High-dimensional VAR with low-rank transition
Entrywise eigenvector analysis of random matrices with low expected rank
Concentration of tempered posteriors and of their variational approximations
On signal detection and confidence sets for low rank inference problems
Adaptive multinomial matrix completion
A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
Detection of a sparse submatrix of a high-dimensional noisy matrix
High-dimensional additive hazards models and the lasso
Non-asymptotic approach to varying coefficient model
Low rank multivariate regression
Rank penalized estimators for high-dimensional matrices
Adaptive confidence sets for matrix completion
Von Neumann entropy penalization and low-rank matrix estimation
A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery
Cross: efficient low-rank tensor completion
Minimax risk of matrix denoising by singular value thresholding
\(L_1\)-penalization in functional linear regression with subgaussian design
Comment on "Hypothesis testing by convex optimization"
Structured volatility matrix estimation for non-synchronized high-frequency financial data
Oracle inequalities for high-dimensional prediction
Outlier detection in networks with missing links
One-bit tensor completion via transformed tensor singular value decomposition
Maximum likelihood estimation of sparse networks with missing observations
Max-norm optimization for robust matrix recovery
Concentration inequalities for matrix martingales in continuous time
Robust low-rank matrix estimation
Slope meets Lasso: improved oracle bounds and optimality
Robust matrix completion
Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries
Regularization and the small-ball method. I: Sparse recovery
Towards optimal estimation of bivariate isotonic matrices with unknown permutations
Degrees of freedom in low rank matrix estimation
Learning with tensors: a framework based on convex optimization and spectral regularization
Matrix factorization for multivariate time series analysis
Parametric and semiparametric reduced-rank regression with flexible sparsity
Prediction error bounds for linear regression with the TREX
Leveraging mixed and incomplete outcomes via reduced-rank modeling
Geometric median and robust estimation in Banach spaces
Numerical comparisons between Bayesian and frequentist low-rank matrix completion: estimation accuracy and uncertainty quantification
Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
On the exponentially weighted aggregate with the Laplace prior
Double fused Lasso regularized regression with both matrix and vector valued predictors
Robust regression via multivariate regression depth
Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
Fast global convergence of gradient methods for high-dimensional statistical recovery
Regularization parameter selection for the low rank matrix recovery
Provable accelerated gradient method for nonconvex low rank optimization
Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
Sharp oracle inequalities for low-complexity priors
Data fusion using factor analysis and low-rank matrix completion
Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
Spectral thresholding for the estimation of Markov chain transition operators
Generalized high-dimensional trace regression via nuclear norm regularization
Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
Structured matrix estimation and completion
Nonparametric estimation of low rank matrix valued function
Doubly penalized estimation in additive regression with high-dimensional data
On relaxed greedy randomized iterative methods for the solution of factorized linear systems
Rapid, robust, and reliable blind deconvolution via nonconvex optimization
ROP: matrix recovery via rank-one projections
Matrix estimation by universal singular value thresholding
A Bayesian approach for noisy matrix completion: optimal rate under general sampling distribution
Lasso and probabilistic inequalities for multivariate point processes
Conditional rotation between forecasting models
Proof methods for robust low-rank matrix recovery



Cites Work