scientific article
From MaRDI portal
Publication: Q2896128
zbMath: 1242.62069
MaRDI QID: Q2896128
Raghunandan H. Keshavan, Andrea Montanari, Sewoong Oh
Publication date: 13 July 2012
Full work available at URL: http://www.jmlr.org/papers/v11/keshavan10a.html
Title: Matrix completion from noisy entries
Related Items (only showing first 100 items)
Matrix Completion Methods for Causal Panel Data Models ⋮ Low Permutation-rank Matrices: Structural Properties and Noisy Completion ⋮ Model-free Nonconvex Matrix Completion: Local Minima Analysis and Applications in Memory-efficient Kernel PCA ⋮ Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery ⋮ Unnamed Item ⋮ A randomised iterative method for solving factorised linear systems ⋮ Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution Under Random Designs ⋮ Operator shifting for noisy elliptic systems ⋮ Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data ⋮ Entrywise limit theorems for eigenvectors of signal-plus-noise matrix models with weak signals ⋮ Link Prediction for Egocentrically Sampled Networks ⋮ A Zero-imputation Approach in Recommendation Systems with Data Missing Heterogeneously ⋮ A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers ⋮ An optimal statistical and computational framework for generalized tensor estimation ⋮ Heteroskedastic PCA: algorithm, optimality, and applications ⋮ \(S_{1/2}\) regularization methods and fixed point algorithms for affine rank minimization problems ⋮ Convex optimization learning of faithful Euclidean distance representations in nonlinear dimensionality reduction ⋮ Optimal large-scale quantum state tomography with Pauli measurements ⋮ Role of sparsity and structure in the optimization landscape of non-convex matrix sensing ⋮ Low-Rank and Sparse Dictionary Learning ⋮ Low-Rank Transfer Learning ⋮ On the nuclear norm and the singular value decomposition of tensors ⋮ Tight risk bound for high dimensional time series completion ⋮ Matrix completion via max-norm constrained optimization ⋮ A rank-corrected procedure for matrix completion with fixed basis coefficients ⋮ Noisy tensor completion via the sum-of-squares hierarchy ⋮ Efficient preconditioning for noisy separable nonnegative matrix factorization problems by successive projection based low-rank approximations ⋮ Iterative Collaborative Filtering for Sparse Matrix Estimation ⋮ Unnamed Item ⋮ A graphical approach to the analysis of matrix completion ⋮ Deterministic algorithms for matrix completion ⋮ Unnamed Item ⋮ Alternating proximal gradient method for convex minimization ⋮ Statistically optimal and computationally efficient low rank tensor completion from noisy entries ⋮ Transfer learning in heterogeneous collaborative filtering domains ⋮ Nonconvex Low-Rank Tensor Completion from Noisy Data ⋮ Optimal prediction in the linearly transformed spiked model ⋮ Double instrumental variable estimation of interaction models with big data ⋮ Bayesian singular value regularization via a cumulative shrinkage process ⋮ An alternating minimization method for matrix completion problems ⋮ Matrix completion by singular value thresholding: sharp bounds ⋮ Empirical Bayes matrix completion ⋮ High-dimensional covariance matrix estimation with missing observations ⋮ Low-rank tensor completion by Riemannian optimization ⋮ Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution ⋮ Random perturbation of low rank matrices: improving classical bounds ⋮ Compressed sensing and matrix completion with constant proportion of corruptions ⋮ Matrix completion with nonconvex regularization: spectral operators and scalable algorithms 
⋮ Entrywise eigenvector analysis of random matrices with low expected rank ⋮ Adaptive multinomial matrix completion ⋮ Generalized transfer subspace learning through low-rank constraint ⋮ Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm ⋮ Compressive Sensing ⋮ Lower bounds for finding stationary points I ⋮ Non-asymptotic approach to varying coefficient model ⋮ Second order accurate distributed eigenvector computation for extremely large matrices ⋮ Rank penalized estimators for high-dimensional matrices ⋮ Adaptive confidence sets for matrix completion ⋮ Asymptotic equivalence of quantum state tomography and noisy matrix completion ⋮ Sampling, denoising and compression of matrices by coherent matrix organization ⋮ Noisy low-rank matrix completion with general sampling distribution ⋮ Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization ⋮ An efficient matrix bi-factorization alternative optimization method for low-rank matrix recovery and completion ⋮ Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion ⋮ Krylov Methods for Low-Rank Regularization ⋮ Stable rank-one matrix completion is solved by the level \(2\) Lasserre relaxation ⋮ Uniform Bounds for Invariant Subspace Perturbations ⋮ Low-Rank Representation of Tensor Network Operators with Long-Range Pairwise Interactions ⋮ Noise-tolerance matrix completion for location recommendation ⋮ A modified augmented Lagrange multiplier algorithm for Toeplitz matrix completion ⋮ Bayesian matrix completion approach to causal inference with panel data ⋮ Adapting Regularized Low-Rank Models for Parallel Architectures ⋮ Rate-optimal perturbation bounds for singular subspaces with applications to high-dimensional statistics ⋮ Max-norm optimization for robust matrix recovery ⋮ Matrix completion based on Gaussian parameterized belief propagation ⋮ Robust matrix completion ⋮ Convex low rank approximation ⋮ Convergence of fixed-point continuation algorithms for matrix rank minimization ⋮ Robust PCA and subspace tracking from incomplete observations using \(\ell_0\)-surrogates ⋮ Fixed point and Bregman iterative methods for matrix rank minimization ⋮ Two Newton methods on the manifold of fixed-rank matrices endowed with Riemannian quotient geometries ⋮ Fixed-rank matrix factorizations and Riemannian low-rank optimization ⋮ Estimation of high-dimensional low-rank matrices ⋮ Estimation of (near) low-rank matrices with noise and high-dimensional scaling ⋮ Unnamed Item ⋮ Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix ⋮ Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach ⋮ Collaborative filtering with information-rich and information-sparse entities ⋮ Robust high-dimensional factor models with applications to statistical machine learning ⋮ Robust principal component pursuit via inexact alternating minimization on matrix manifolds ⋮ Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees ⋮ Decentralized and privacy-preserving low-rank matrix completion ⋮ Linear Models Based on Noisy Data and the Frisch Scheme ⋮ Majorized proximal alternating imputation for regularized rank constrained matrix completion ⋮ Guarantees of Riemannian Optimization for Low Rank Matrix Recovery ⋮ Optspace ⋮ Spectral method and regularized MLE are both optimal for top-\(K\) ranking ⋮ Riemannian Optimization for High-Dimensional Tensor Completion ⋮ Online Collaborative Filtering on Graphs ⋮ Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
This page was built for publication: Matrix completion from noisy entries