Estimation of high-dimensional low-rank matrices
From MaRDI portal
Publication: 548539
DOI: 10.1214/10-AOS860 · zbMath: 1215.62056 · arXiv: 0912.5338 · OpenAlex: W2281296685 · MaRDI QID: Q548539
Angelika Rohde, Alexandre B. Tsybakov
Publication date: 29 June 2011
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0912.5338
Keywords: empirical process, sparse recovery, Schatten norm, penalized least-squares estimator, quasi-convex Schatten class embeddings
Estimation in multivariate analysis (62H12) Nonparametric estimation (62G05) Point estimation (62F10)
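As a hedged illustration of the penalized least-squares estimators with Schatten-norm penalties listed in the keywords (this is not the paper's exact procedure): for the special case of the nuclear norm (Schatten-1) penalty in the matrix denoising model, the estimator has a well-known closed form obtained by soft-thresholding the singular values of the observation. A minimal NumPy sketch, with illustrative dimensions and a hypothetical tuning parameter `lam`:

```python
import numpy as np

def svt(Y, lam):
    """Singular value thresholding: the closed-form minimizer of
    0.5 * ||Y - A||_F^2 + lam * ||A||_*  (nuclear-norm penalty)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_thr = np.maximum(s - lam, 0.0)  # soft-threshold the singular values
    return U @ np.diag(s_thr) @ Vt

# Toy example: a rank-2 signal observed with Gaussian noise
rng = np.random.default_rng(0)
n, m, r = 50, 40, 2
A0 = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
Y = A0 + 0.1 * rng.standard_normal((n, m))

# lam is chosen above the noise spectral level so that pure-noise
# singular values are set exactly to zero (illustrative value only)
A_hat = svt(Y, lam=2.0)
print(np.linalg.matrix_rank(A_hat))
```

The estimate is exactly low rank because every singular value at or below `lam` is mapped to zero; choosing `lam` of the order of the spectral norm of the noise is what drives the oracle-type risk bounds studied in this literature.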
Related Items
- Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
- Approximation, Gelfand, and Kolmogorov numbers of Schatten class embeddings
- Direct shrinkage estimation of large dimensional precision matrix
- Matrix Completion Methods for Causal Panel Data Models
- An Exact and Robust Conformal Inference Method for Counterfactual and Synthetic Controls
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Matrix Completion under Low-Rank Missing Mechanism
- Reconstruction of a high-dimensional low-rank matrix
- \(S_{1/2}\) regularization methods and fixed point algorithms for affine rank minimization problems
- Low-rank diffusion matrix estimation for high-dimensional time-changed Lévy processes
- Optimal large-scale quantum state tomography with Pauli measurements
- Stable estimation of a covariance matrix guided by nuclear norm penalties
- Matrix completion via max-norm constrained optimization
- Distributed estimation in heterogeneous reduced rank regression: with application to order determination in sufficient dimension reduction
- Geometric inference for general high-dimensional linear inverse problems
- A rank-corrected procedure for matrix completion with fixed basis coefficients
- Estimation of matrices with row sparsity
- Global optimality condition and fixed point continuation algorithm for non-Lipschitz \(\ell_p\) regularized matrix minimization
- A generalized likelihood ratio test for normal mean when \(p\) is greater than \(n\)
- Bayesian sparse reduced rank multivariate regression
- Low rank estimation of smooth kernels on graphs
- Statistically optimal and computationally efficient low rank tensor completion from noisy entries
- Entropy numbers of embeddings of Schatten classes
- Optimal prediction in the linearly transformed spiked model
- Exponential weights in multivariate regression and a low-rankness favoring prior
- Bayesian singular value regularization via a cumulative shrinkage process
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- High-dimensional covariance matrix estimation with missing observations
- Exact minimum rank approximation via Schatten \(p\)-norm minimization
- High-dimensional latent panel quantile regression with an application to asset pricing
- Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
- A framework of regularized low-rank matrix models for regression and classification
- An efficient semi-proximal ADMM algorithm for low-rank and sparse regularized matrix minimization problems with real-world applications
- Profile GMM estimation of panel data models with interactive fixed effects
- Universal Features for High-Dimensional Learning and Inference
- Covariate-assisted matrix completion with multiple structural breaks
- Efficient estimation of approximate factor models via penalized maximum likelihood
- Robust matrix estimations meet Frank-Wolfe algorithm
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data
- Regularized sample average approximation for high-dimensional stochastic optimization under low-rankness
- High-dimensional VARs with common factors
- Inference for low-rank models
- Understanding Implicit Regularization in Over-Parameterized Single Index Model
- Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
- Low rank multivariate regression
- Rank penalized estimators for high-dimensional matrices
- Adaptive confidence sets for matrix completion
- Asymptotic equivalence of quantum state tomography and noisy matrix completion
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Von Neumann entropy penalization and low-rank matrix estimation
- Sparse PCA: optimal rates and adaptive estimation
- Noisy low-rank matrix completion with general sampling distribution
- Multi-Armed Angle-Based Direct Learning for Estimating Optimal Individualized Treatment Rules With Various Outcomes
- On the strong convergence of the optimal linear shrinkage estimator for large dimensional covariance matrix
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Cross: efficient low-rank tensor completion
- Minimax risk of matrix denoising by singular value thresholding
- Max-norm optimization for robust matrix recovery
- Concentration inequalities for matrix martingales in continuous time
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- Robust low-rank matrix estimation
- Robust matrix completion
- Regularization and the small-ball method. I: Sparse recovery
- Degrees of freedom in low rank matrix estimation
- Sparse trace norm regularization
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
- Geometric median and robust estimation in Banach spaces
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- The analysis of multivariate data using semi-definite programming
- High-dimensional regression with unknown variance
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Discussion of ``Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation''
- On the exponentially weighted aggregate with the Laplace prior
- Oracle posterior contraction rates under hierarchical priors
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Nonlinear shrinkage estimation of large-dimensional covariance matrices
- Linear Models Based on Noisy Data and the Frisch Scheme
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Regularization parameter selection for the low rank matrix recovery
- Gelfand numbers of embeddings of Schatten classes
- Provable accelerated gradient method for nonconvex low rank optimization
- Recent advances in shrinkage-based high-dimensional inference
- Matrix Completion With Covariate Information
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
- Regularization and the small-ball method II: complexity dependent error rates
- Estimating high-dimensional covariance and precision matrices under general missing dependence
- Low Rank Estimation of Similarities on Graphs
- Nonparametric estimation of low rank matrix valued function
- On Cross-Validation for Sparse Reduced Rank Regression
- ROP: matrix recovery via rank-one projections
- Matrix estimation by universal singular value thresholding
Cites Work
- Exponential screening and optimal rates of sparse estimation
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Oracle inequalities and optimal inference under group sparsity
- User-friendly tail bounds for sums of random matrices
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Optimal rates of convergence for covariance matrix estimation
- The risk inflation criterion for multiple regression
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Simultaneous analysis of Lasso and Dantzig selector
- Aggregation for Gaussian regression
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- \(C_p\)
- Square root penalty: Adaption to the margin in classification and in edge estimation
- Exact matrix completion via convex optimization
- Decoding by Linear Programming
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Remarks on Inequalities for Large Deviation Probabilities
- Entropy Numbers and Approximation Numbers in Function Spaces
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- A Simpler Approach to Matrix Completion
- Learning Theory
- Introduction to nonparametric estimation
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers