Estimation of (near) low-rank matrices with noise and high-dimensional scaling
From MaRDI portal
Abstract: High-dimensional inference refers to problems of statistical estimation in which the ambient dimension of the data may be comparable to or possibly even larger than the sample size. We study an instance of high-dimensional inference in which the goal is to estimate a matrix on the basis of noisy observations, and the unknown matrix is assumed to be either exactly low rank, or "near low-rank", meaning that it can be well-approximated by a matrix with low rank. We consider an \(M\)-estimator based on regularization by the trace or nuclear norm over matrices, and analyze its performance under high-dimensional scaling. We provide non-asymptotic bounds on the Frobenius norm error that hold for a general class of noisy observation models, and then illustrate their consequences for a number of specific matrix models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections. Simulation results show excellent agreement with the high-dimensional scaling of the error predicted by our theory.
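The nuclear-norm regularized estimator described in the abstract can be illustrated in its simplest instance, matrix denoising, where the solution has a closed form via singular value soft-thresholding. The sketch below is an illustrative toy (the function names, the rank-2 test matrix, and the choice of regularization level \(\lambda\) are assumptions for the example, not the paper's setup or tuning rule):

```python
import numpy as np

def singular_value_threshold(M, tau):
    """Proximal operator of tau * ||.||_* : soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def nuclear_norm_denoise(Y, lam):
    """Closed-form minimizer of 0.5*||Theta - Y||_F^2 + lam*||Theta||_*."""
    return singular_value_threshold(Y, lam)

# Toy example: a rank-2 signal observed under entrywise Gaussian noise.
rng = np.random.default_rng(0)
d1, d2, r = 40, 30, 2
Theta_star = rng.standard_normal((d1, r)) @ rng.standard_normal((r, d2))
Y = Theta_star + 0.1 * rng.standard_normal((d1, d2))

# lam is set above the noise level ~ sigma*(sqrt(d1)+sqrt(d2)), an assumed heuristic.
Theta_hat = nuclear_norm_denoise(Y, lam=1.5)
rel_err = np.linalg.norm(Theta_hat - Theta_star) / np.linalg.norm(Theta_star)
```

For more general observation models treated in the paper (multivariate regression, random projections, autoregressive identification), the same nuclear-norm penalty is combined with a quadratic data-fit term and the estimator no longer has a closed form; proximal gradient iterations that alternate a gradient step with the thresholding operator above are a standard way to compute it.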
Recommendations
- Estimation of high-dimensional low-rank matrices
- High dimensional matrix estimation with unknown variance of the noise
- Asymptotic theory for estimating the singular vectors and values of a partially-observed low rank matrix with noise
- Robust low-rank matrix estimation
- Low-rank Matrix Recovery from Noisy, Quantized and Erroneous Measurements
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
- Estimation of low-rank matrices via approximate message passing
- Rank penalized estimators for high-dimensional matrices
- A unified approach to estimating a normal mean matrix in high and low dimensions
Cites work
- scientific article; zbMATH DE number 194139
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3357844
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Atomic Decomposition by Basis Pursuit
- Compressed sensing
- Compressed sensing and best \(k\)-term approximation
- Consistency of trace norm minimization
- Covariance regularization by thresholding
- Decoding by Linear Programming
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Estimation of high-dimensional low-rank matrices
- Exact matrix completion via convex optimization
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- High-dimensional graphs and variable selection with the Lasso
- Interior-point method for nuclear norm approximation with application to system identification
- Matrix Analysis
- Matrix completion from noisy entries
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Model Selection and Estimation in Regression with Grouped Variables
- Model selection and estimation in the Gaussian graphical model
- Null space conditions and thresholds for rank minimization
- On the distribution of the largest eigenvalue in principal components analysis
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Regularization in statistics
- Regularized estimation of large covariance matrices
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Semidefinite Programming
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse inverse covariance estimation with the graphical lasso
- Sparse permutation invariant covariance estimation
- Sparse recovery problems in high dimensions: statistical inference and learning theory. Abstracts from the mini-workshop held March 15th -- March 21st, 2009.
- Support union recovery in high-dimensional multivariate regression
- The benefit of group sparsity
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in
(only showing first 100 items)
- On signal detection and confidence sets for low rank inference problems
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- High-dimensional regression with unknown variance
- Universal Features for High-Dimensional Learning and Inference
- Multiple change points detection in high-dimensional multivariate regression
- Entrywise Estimation of Singular Vectors of Low-Rank Matrices With Heteroskedasticity and Dependence
- Supervised Principal Component Regression for Functional Responses with High Dimensional Predictors
- D4R: doubly robust reduced rank regression in high dimension
- Low rank multivariate regression
- Generalized co-sparse factor regression
- High-dimensional dynamic systems identification with additional constraints
- Fast Estimation of Approximate Matrix Ranks Using Spectral Densities
- Convex relaxation algorithm for a structured simultaneous low-rank and sparse recovery problem
- A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Matrix Completion Methods for Causal Panel Data Models
- On the degrees of freedom of mixed matrix regression
- Low-rank matrix recovery under heavy-tailed errors
- Matrix denoising for weighted loss functions and heterogeneous signals
- Simultaneous Decorrelation of Matrix Time Series
- Reconstruction of a high-dimensional low-rank matrix
- Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Regularized estimation of high-dimensional factor-augmented vector autoregressive (FAVAR) models
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Low-rank diffusion matrix estimation for high-dimensional time-changed Lévy processes
- scientific article; zbMATH DE number 7370581
- Mining the factor zoo: estimation of latent factor models with sufficient proxies
- High-dimensional low-rank tensor autoregressive time series modeling
- Estimation of a sparse and spiked covariance matrix
- Isolated calmness of solution mappings and exact recovery conditions for nuclear norm optimization problems
- Optimal large-scale quantum state tomography with Pauli measurements
- Recursive estimation for ordered eigenvectors of symmetric matrix with observation noise
- An Exact and Robust Conformal Inference Method for Counterfactual and Synthetic Controls
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Matrix estimation by universal singular value thresholding
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Double fused Lasso penalized LAD for matrix regression
- Forecasting mortality with international linkages: a global vector-autoregression approach
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Von Neumann entropy penalization and low-rank matrix estimation
- Learning Markov models via low-rank optimization
- Exponential weights in multivariate regression and a low-rankness favoring prior
- Regularized high dimension low tubal-rank tensor regression
- Robust matrix estimations meet Frank-Wolfe algorithm
- Hierarchical nuclear norm penalization for multi-view data integration
- Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
- scientific article; zbMATH DE number 125135
- Noisy low-rank matrix completion with general sampling distribution
- Robust low-rank matrix estimation
- Collective matrix completion
- Bootstrap-based regularization for low-rank matrix estimation
- Multivariate response regression with low-rank and generalized sparsity
- Multistage convex relaxation approach to rank regularized minimization problems based on equivalent mathematical program with a generalized complementarity constraint
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Equivalent Lipschitz surrogates for zero-norm and rank optimization problems
- Regularized estimation and testing for high-dimensional multi-block vector-autoregressive models
- Dynamic assortment personalization in high dimensions
- High-dimensional latent panel quantile regression with an application to asset pricing
- Regularization parameter selection for the low rank matrix recovery
- High-dimensional VAR with low-rank transition
- Matrix completion via max-norm constrained optimization
- Sparse PCA: optimal rates and adaptive estimation
- Reduced-rank estimation for ill-conditioned stochastic linear model with high signal-to-noise ratio
- Column \(\ell_{2,0}\)-norm regularized factorization model of low-rank matrix recovery and its computation
- Imputed quantile tensor regression for near-sited spatial-temporal data
- High-Dimensional Vector Autoregressive Time Series Modeling via Tensor Decomposition
- Bayesian singular value regularization via a cumulative shrinkage process
- On model selection consistency of regularized M-estimators
- Estimation and inference in a high-dimensional semiparametric Gaussian copula vector autoregressive model
- Maximum likelihood estimation of sparse networks with missing observations
- Estimation of Low Rank High-Dimensional Multivariate Linear Models for Multi-Response Data
- Provable accelerated gradient method for nonconvex low rank optimization
- The EAS approach for graphical selection consistency in vector autoregression models
- Estimation of low rank density matrices: bounds in Schatten norms and other distances
- Spectral analysis of high-dimensional time series
- Iterative reconstruction of rank-one matrices in noise
- Sparse distance metric learning
- Consistency of trace norm minimization
- Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
- Oracle inequality for sparse trace regression models with exponential \(\beta\)-mixing errors
- Robust Recovery of Low-Rank Matrices and Low-Tubal-Rank Tensors from Noisy Sketches
- Factorisable multitask quantile regression
- Linear models based on noisy data and the Frisch scheme
- Matrix estimation, latent variable model and collaborative filtering
- High-dimensional estimation of quadratic variation based on penalized realized variance
- Trace regression model with simultaneously low rank and row(column) sparse parameter
- Multivariate factorizable expectile regression with application to fMRI data
- Inference for low-rank models
- Expectile trace regression via low-rank and group sparsity regularization
- Distributed estimation and inference for spatial autoregression model with large scale networks
- Rate-optimal robust estimation of high-dimensional vector autoregressive models
- Sequential Scaled Sparse Factor Regression
- Multivariate log-contrast regression with sub-compositional predictors: testing the association between preterm infants' gut microbiome and neurobehavioral outcomes
- Reduced-Rank Envelope Vector Autoregressive Model
- Rank penalized estimators for high-dimensional matrices
- Low-rank elastic-net regularized multivariate Huber regression model
- Regularized estimation in sparse high-dimensional time series models
- High dimensional deformed rectangular matrices with applications in matrix denoising
- Generalized high-dimensional trace regression via nuclear norm regularization
This page was built for publication: Estimation of (near) low-rank matrices with noise and high-dimensional scaling
MaRDI item: Q548547