Estimation of (near) low-rank matrices with noise and high-dimensional scaling
From MaRDI portal
Publication: 548547
DOI: 10.1214/10-AOS850
zbMATH Open: 1216.62090
arXiv: 0912.5100
OpenAlex: W2586353914
MaRDI QID: Q548547
Authors: Sahand Negahban, Martin J. Wainwright
Publication date: 29 June 2011
Published in: The Annals of Statistics
Abstract: High-dimensional inference refers to problems of statistical estimation in which the ambient dimension of the data may be comparable to or possibly even larger than the sample size. We study an instance of high-dimensional inference in which the goal is to estimate a matrix on the basis of noisy observations, and the unknown matrix is assumed to be either exactly low rank, or "near low-rank", meaning that it can be well-approximated by a matrix with low rank. We consider an \(M\)-estimator based on regularization by the trace or nuclear norm over matrices, and analyze its performance under high-dimensional scaling. We provide non-asymptotic bounds on the Frobenius norm error that hold for a general class of noisy observation models, and then illustrate their consequences for a number of specific matrix models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections. Simulation results show excellent agreement with the high-dimensional scaling of the error predicted by our theory.
Full work available at URL: https://arxiv.org/abs/0912.5100
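The nuclear-norm-regularized estimation described in the abstract can be illustrated in its simplest special case. The sketch below (an assumption for illustration, not the paper's general \(M\)-estimator) uses the identity observation model \(Y = \Theta^* + \text{noise}\), for which the nuclear-norm-penalized least-squares estimate has a closed form: soft-thresholding of the singular values of \(Y\).

```python
import numpy as np

def svt(Y, lam):
    """Singular value soft-thresholding: the closed-form solution of
    argmin_T 0.5*||Y - T||_F^2 + lam*||T||_* (proximal map of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    # Shrink each singular value toward zero by lam, clipping at zero;
    # singular vectors are left unchanged.
    return (U * np.maximum(s - lam, 0.0)) @ Vt

rng = np.random.default_rng(0)
n, r = 50, 3
# Exactly rank-r ground truth observed under additive Gaussian noise
Theta = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
Y = Theta + 0.1 * rng.standard_normal((n, n))

# lam chosen on the order of the noise spectral norm (~ sigma * 2 * sqrt(n))
Theta_hat = svt(Y, lam=1.5)
print("denoised Frobenius error:", np.linalg.norm(Theta_hat - Theta))
print("raw Frobenius error:     ", np.linalg.norm(Y - Theta))
```

For more general observation operators (multivariate regression, autoregressive processes, random projections), the estimator has no closed form and is typically computed by proximal gradient methods, with `svt` applied at each iteration.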
Recommendations
- Estimation of high-dimensional low-rank matrices
- High dimensional matrix estimation with unknown variance of the noise
- Asymptotic theory for estimating the singular vectors and values of a partially-observed low rank matrix with noise
- Robust low-rank matrix estimation
- Low-rank Matrix Recovery from Noisy, Quantized and Erroneous Measurements
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
- Estimation of low-rank matrices via approximate message passing
- Rank penalized estimators for high-dimensional matrices
- A unified approach to estimating a normal mean matrix in high and low dimensions
Keywords: high-dimensional inference; nuclear norm; random matrix theory; \(M\)-estimators; trace norm; rank constraints
Cites Work
- Matrix completion from noisy entries
- Matrix Analysis
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional graphs and variable selection with the Lasso
- Atomic Decomposition by Basis Pursuit
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Model Selection and Estimation in Regression with Grouped Variables
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Covariance regularization by thresholding
- Sparse inverse covariance estimation with the graphical lasso
- On the distribution of the largest eigenvalue in principal components analysis
- Sparse permutation invariant covariance estimation
- Regularized estimation of large covariance matrices
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Model selection and estimation in the Gaussian graphical model
- Exact matrix completion via convex optimization
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Decoding by Linear Programming
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Semidefinite Programming
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Estimation of high-dimensional low-rank matrices
- Regularization in statistics
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Compressed sensing
- The benefit of group sparsity
- Compressed sensing and best \(k\)-term approximation
- Consistency of trace norm minimization
- Interior-point method for nuclear norm approximation with application to system identification
- Support union recovery in high-dimensional multivariate regression
- Null space conditions and thresholds for rank minimization
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Sparse recovery problems in high dimensions: statistical inference and learning theory. Abstracts from the mini-workshop held March 15th -- March 21st, 2009.
Cited In (only showing first 100 items)
- On signal detection and confidence sets for low rank inference problems
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Low rank multivariate regression
- Generalized co-sparse factor regression
- A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery
- Convex relaxation algorithm for a structured simultaneous low-rank and sparse recovery problem
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- On the degrees of freedom of mixed matrix regression
- Reconstruction of a high-dimensional low-rank matrix
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Estimation of a sparse and spiked covariance matrix
- Recursive estimation for ordered eigenvectors of symmetric matrix with observation noise
- Optimal large-scale quantum state tomography with Pauli measurements
- Matrix estimation by universal singular value thresholding
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Learning Markov models via low-rank optimization
- Von Neumann entropy penalization and low-rank matrix estimation
- Regularized high dimension low tubal-rank tensor regression
- Noisy low-rank matrix completion with general sampling distribution
- Multistage convex relaxation approach to rank regularized minimization problems based on equivalent mathematical program with a generalized complementarity constraint
- Robust low-rank matrix estimation
- Multivariate response regression with low-rank and generalized sparsity
- Regularized estimation and testing for high-dimensional multi-block vector-autoregressive models
- Regularization parameter selection for the low rank matrix recovery
- Sparse PCA: optimal rates and adaptive estimation
- Matrix completion via max-norm constrained optimization
- Reduced-rank estimation for ill-conditioned stochastic linear model with high signal-to-noise ratio
- On model selection consistency of regularized M-estimators
- Estimation of Low Rank High-Dimensional Multivariate Linear Models for Multi-Response Data
- Maximum likelihood estimation of sparse networks with missing observations
- Iterative reconstruction of rank-one matrices in noise
- Spectral analysis of high-dimensional time series
- Estimation of low rank density matrices: bounds in Schatten norms and other distances
- Consistency of trace norm minimization
- High-dimensional estimation of quadratic variation based on penalized realized variance
- Inference for low-rank models
- Rank penalized estimators for high-dimensional matrices
- Regularized estimation in sparse high-dimensional time series models
- A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
- Leveraging mixed and incomplete outcomes via reduced-rank modeling
- ROP: matrix recovery via rank-one projections
- High dimensional matrix estimation with unknown variance of the noise
- Orthogonal rank-one matrix pursuit for low rank matrix completion
- Tackling Small Eigen-Gaps: Fine-Grained Eigenvector Estimation and Inference Under Heteroscedastic Noise
- Asymptotic equivalence of quantum state tomography and noisy matrix completion
- Factor Models for High-Dimensional Tensor Time Series
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Autoregressive models for matrix-valued time series
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Geometric median and robust estimation in Banach spaces
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Projected principal component analysis in factor models
- Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
- Statistical Inference for High-Dimensional Vector Autoregression with Measurement Error
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Estimation of high-dimensional low-rank matrices
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Sparse reduced-rank regression with covariance estimation
- Cross: efficient low-rank tensor completion
- Calmness of partial perturbation to composite rank constraint systems and its applications
- High-dimensional regression with unknown variance
- Degrees of freedom in low rank matrix estimation
- Supervised Principal Component Regression for Functional Responses with High Dimensional Predictors
- D4R: doubly robust reduced rank regression in high dimension
- Entrywise Estimation of Singular Vectors of Low-Rank Matrices With Heteroskedasticity and Dependence
- High-dimensional dynamic systems identification with additional constraints
- Fast Estimation of Approximate Matrix Ranks Using Spectral Densities
- Matrix Completion Methods for Causal Panel Data Models
- Low-rank matrix recovery under heavy-tailed errors
- Simultaneous Decorrelation of Matrix Time Series
- Matrix denoising for weighted loss functions and heterogeneous signals
- Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising
- Regularized estimation of high-dimensional factor-augmented vector autoregressive (FAVAR) models
- Mining the factor zoo: estimation of latent factor models with sufficient proxies
- High-dimensional low-rank tensor autoregressive time series modeling
- Isolated calmness of solution mappings and exact recovery conditions for nuclear norm optimization problems
- Low-rank diffusion matrix estimation for high-dimensional time-changed Lévy processes
- An Exact and Robust Conformal Inference Method for Counterfactual and Synthetic Controls
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Double fused Lasso penalized LAD for matrix regression
- Forecasting mortality with international linkages: a global vector-autoregression approach
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Robust matrix estimations meet Frank-Wolfe algorithm
- Hierarchical nuclear norm penalization for multi-view data integration
- Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
- Exponential weights in multivariate regression and a low-rankness favoring prior
- Collective matrix completion
- Bootstrap-based regularization for low-rank matrix estimation
- High-dimensional latent panel quantile regression with an application to asset pricing
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Dynamic assortment personalization in high dimensions
- Equivalent Lipschitz surrogates for zero-norm and rank optimization problems
- Imputed quantile tensor regression for near-sited spatial-temporal data
- Column \(\ell_{2,0}\)-norm regularized factorization model of low-rank matrix recovery and its computation
- High-dimensional VAR with low-rank transition