Estimation of (near) low-rank matrices with noise and high-dimensional scaling
Abstract: High-dimensional inference refers to problems of statistical estimation in which the ambient dimension of the data may be comparable to or possibly even larger than the sample size. We study an instance of high-dimensional inference in which the goal is to estimate a matrix on the basis of noisy observations, and the unknown matrix is assumed to be either exactly low rank, or "near low-rank," meaning that it can be well-approximated by a matrix with low rank. We consider an \(M\)-estimator based on regularization by the trace or nuclear norm over matrices, and analyze its performance under high-dimensional scaling. We provide non-asymptotic bounds on the Frobenius norm error that hold for a general class of noisy observation models, and then illustrate their consequences for a number of specific matrix models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections. Simulation results show excellent agreement with the high-dimensional scaling of the error predicted by our theory.
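The nuclear-norm-regularized \(M\)-estimator described in the abstract can be illustrated with a short numerical sketch. The Python code below (an illustrative assumption, not the authors' implementation; the function names, step size, iteration count, and regularization level `lam` are all chosen for the toy example) solves the multivariate-regression instance \(\min_\Theta \frac{1}{2n}\|Y - X\Theta\|_F^2 + \lambda \|\Theta\|_*\) by proximal gradient descent, whose proximal step is soft-thresholding of the singular values.

```python
import numpy as np

def svd_soft_threshold(M, tau):
    """Soft-threshold the singular values of M at level tau
    (the proximal operator of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def nuclear_norm_estimate(X, Y, lam, step=None, n_iter=500):
    """Proximal-gradient solver (a minimal sketch) for
        min_Theta (1/2n)||Y - X Theta||_F^2 + lam * ||Theta||_*
    in the multivariate-regression observation model Y = X Theta + noise."""
    n, p = X.shape
    q = Y.shape[1]
    if step is None:
        # 1/L, where L = sigma_max(X)^2 / n is the Lipschitz
        # constant of the gradient of the smooth loss term
        step = n / np.linalg.norm(X, 2) ** 2
    Theta = np.zeros((p, q))
    for _ in range(n_iter):
        grad = X.T @ (X @ Theta - Y) / n
        Theta = svd_soft_threshold(Theta - step * grad, step * lam)
    return Theta

# Toy check: recover a rank-2 coefficient matrix from noisy observations.
rng = np.random.default_rng(0)
n, p, q, r = 200, 30, 25, 2
Theta_star = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
X = rng.standard_normal((n, p))
Y = X @ Theta_star + 0.5 * rng.standard_normal((n, q))
Theta_hat = nuclear_norm_estimate(X, Y, lam=0.5)
print("Frobenius error:", np.linalg.norm(Theta_hat - Theta_star))
```

The Frobenius-norm error printed at the end is the quantity bounded non-asymptotically in the paper; rerunning with larger \(p, q\) relative to \(n\) exhibits the high-dimensional scaling the theory predicts.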
Recommendations
- Estimation of high-dimensional low-rank matrices
- High dimensional matrix estimation with unknown variance of the noise
- Asymptotic theory for estimating the singular vectors and values of a partially-observed low rank matrix with noise
- Robust low-rank matrix estimation
- Low-rank Matrix Recovery from Noisy, Quantized and Erroneous Measurements
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
- Estimation of low-rank matrices via approximate message passing
- Rank penalized estimators for high-dimensional matrices
- A unified approach to estimating a normal mean matrix in high and low dimensions
Cites work
- scientific article; zbMATH DE number 194139
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3357844
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Atomic Decomposition by Basis Pursuit
- Compressed sensing
- Compressed sensing and best \(k\)-term approximation
- Consistency of trace norm minimization
- Covariance regularization by thresholding
- Decoding by Linear Programming
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Estimation of high-dimensional low-rank matrices
- Exact matrix completion via convex optimization
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- High-dimensional graphs and variable selection with the Lasso
- Interior-point method for nuclear norm approximation with application to system identification
- Matrix Analysis
- Matrix completion from noisy entries
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Model Selection and Estimation in Regression with Grouped Variables
- Model selection and estimation in the Gaussian graphical model
- Null space conditions and thresholds for rank minimization
- On the distribution of the largest eigenvalue in principal components analysis
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Regularization in statistics
- Regularized estimation of large covariance matrices
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Semidefinite Programming
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse inverse covariance estimation with the graphical lasso
- Sparse permutation invariant covariance estimation
- Sparse recovery problems in high dimensions: statistical inference and learning theory. Abstracts from the mini-workshop held March 15th -- March 21st, 2009.
- Support union recovery in high-dimensional multivariate regression
- The benefit of group sparsity
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (showing first 100 items)
- A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
- Error bound of critical points and KL property of exponent 1/2 for squared F-norm regularized factorization
- Leveraging mixed and incomplete outcomes via reduced-rank modeling
- The rate of convergence for sparse and low-rank quantile trace regression
- A framework of regularized low-rank matrix models for regression and classification
- Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method
- Comment: Feature Screening and Variable Selection via Iterative Ridge Regression
- Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data
- ROP: matrix recovery via rank-one projections
- On the exponentially weighted aggregate with the Laplace prior
- Sparse trace norm regularization
- High dimensional matrix estimation with unknown variance of the noise
- A new perspective on low-rank optimization
- Asymptotic equivalence of quantum state tomography and noisy matrix completion
- Tackling Small Eigen-Gaps: Fine-Grained Eigenvector Estimation and Inference Under Heteroscedastic Noise
- Orthogonal rank-one matrix pursuit for low rank matrix completion
- Several classes of stationary points for rank regularized minimization problems
- A semiparametric model for matrix regression
- Bayesian sparse reduced rank multivariate regression
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Optimal prediction in the linearly transformed spiked model
- Iterative Collaborative Filtering for Sparse Matrix Estimation
- Factor Models for High-Dimensional Tensor Time Series
- Random perturbation of low rank matrices: improving classical bounds
- Intelligent Initialization and Adaptive Thresholding for Iterative Matrix Completion: Some Statistical and Algorithmic Theory for Adaptive-Impute
- Dynamic Matrix Recovery
- Autoregressive models for matrix-valued time series
- High-dimensional VARs with common factors
- Large dimensional latent factor modeling with missing observations and applications to causal inference
- Geometric median and robust estimation in Banach spaces
- Quantile trace regression via nuclear norm regularization
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
- Sparse reduced-rank regression for simultaneous rank and variable selection via manifold optimization
- Scalable interpretable multi-response regression via SEED
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Projected principal component analysis in factor models
- Statistical Inference for High-Dimensional Vector Autoregression with Measurement Error
- Profile GMM estimation of panel data models with interactive fixed effects
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Regular and limiting normal cones to the graph of the subdifferential mapping of the nuclear norm
- Scenario-based quantile connectedness of the U.S. interbank liquidity risk network
- Nuclear norm regularized quantile regression with interactive fixed effects
- Robust Recommendation via Social Network Enhanced Matrix Completion
- Estimation of high-dimensional low-rank matrices
- Large volatility matrix analysis using global and national factor models
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Sparse reduced-rank regression with covariance estimation
- Cross: efficient low-rank tensor completion
- Heterogeneity adjustment with applications to graphical model inference
- Calmness of partial perturbation to composite rank constraint systems and its applications
- Median-truncated gradient descent: a robust and scalable nonconvex approach for signal estimation
- Robust multicategory support matrix machines
- Degrees of freedom in low rank matrix estimation
- On signal detection and confidence sets for low rank inference problems
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- High-dimensional regression with unknown variance
- Universal Features for High-Dimensional Learning and Inference
- Multiple change points detection in high-dimensional multivariate regression
- Entrywise Estimation of Singular Vectors of Low-Rank Matrices With Heteroskedasticity and Dependence
- Supervised Principal Component Regression for Functional Responses with High Dimensional Predictors
- D4R: doubly robust reduced rank regression in high dimension
- Low rank multivariate regression
- Generalized co-sparse factor regression
- High-dimensional dynamic systems identification with additional constraints
- Fast Estimation of Approximate Matrix Ranks Using Spectral Densities
- Convex relaxation algorithm for a structured simultaneous low-rank and sparse recovery problem
- A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Matrix Completion Methods for Causal Panel Data Models
- On the degrees of freedom of mixed matrix regression
- Low-rank matrix recovery under heavy-tailed errors
- Matrix denoising for weighted loss functions and heterogeneous signals
- Simultaneous Decorrelation of Matrix Time Series
- Reconstruction of a high-dimensional low-rank matrix
- Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Regularized estimation of high-dimensional factor-augmented vector autoregressive (FAVAR) models
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Low-rank diffusion matrix estimation for high-dimensional time-changed Lévy processes
- scientific article; zbMATH DE number 7370581
- Mining the factor zoo: estimation of latent factor models with sufficient proxies
- High-dimensional low-rank tensor autoregressive time series modeling
- Estimation of a sparse and spiked covariance matrix
- Isolated calmness of solution mappings and exact recovery conditions for nuclear norm optimization problems
- Optimal large-scale quantum state tomography with Pauli measurements
- Recursive estimation for ordered eigenvectors of symmetric matrix with observation noise
- An Exact and Robust Conformal Inference Method for Counterfactual and Synthetic Controls
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Matrix estimation by universal singular value thresholding
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Double fused Lasso penalized LAD for matrix regression
- Forecasting mortality with international linkages: a global vector-autoregression approach
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Von Neumann entropy penalization and low-rank matrix estimation
- Learning Markov models via low-rank optimization
- Exponential weights in multivariate regression and a low-rankness favoring prior
- Regularized high dimension low tubal-rank tensor regression