Estimation of (near) low-rank matrices with noise and high-dimensional scaling
From MaRDI portal
Publication: 548547
DOI: 10.1214/10-AOS850
zbMath: 1216.62090
arXiv: 0912.5100
OpenAlex: W2586353914
MaRDI QID: Q548547
Martin J. Wainwright, Sahand N. Negahban
Publication date: 29 June 2011
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0912.5100
Keywords: random matrix theory; high-dimensional inference; nuclear norm; trace norm; \(M\)-estimators; rank constraints
Related Items
- Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
- High-Dimensional Vector Autoregressive Time Series Modeling via Tensor Decomposition
- Matrix Completion Methods for Causal Panel Data Models
- An Exact and Robust Conformal Inference Method for Counterfactual and Synthetic Controls
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Estimation of Low Rank High-Dimensional Multivariate Linear Models for Multi-Response Data
- Reconstruction of a high-dimensional low-rank matrix
- Low-rank diffusion matrix estimation for high-dimensional time-changed Lévy processes
- Optimal large-scale quantum state tomography with Pauli measurements
- Generalized co-sparse factor regression
- Regularized high dimension low tubal-rank tensor regression
- Matrix completion via max-norm constrained optimization
- Sparse reduced-rank regression with covariance estimation
- Estimation of low rank density matrices: bounds in Schatten norms and other distances
- Iterative Collaborative Filtering for Sparse Matrix Estimation
- Trace regression model with simultaneously low rank and row (column) sparse parameter
- Bayesian sparse reduced rank multivariate regression
- Multivariate factorizable expectile regression with application to fMRI data
- Column $\ell_{2,0}$-Norm Regularized Factorization Model of Low-Rank Matrix Recovery and Its Computation
- Optimal prediction in the linearly transformed spiked model
- Exponential weights in multivariate regression and a low-rankness favoring prior
- High-dimensional dynamic systems identification with additional constraints
- Bayesian singular value regularization via a cumulative shrinkage process
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Convex relaxation algorithm for a structured simultaneous low-rank and sparse recovery problem
- Regular and limiting normal cones to the graph of the subdifferential mapping of the nuclear norm
- Multistage Convex Relaxation Approach to Rank Regularized Minimization Problems Based on Equivalent Mathematical Program with a Generalized Complementarity Constraint
- Random perturbation of low rank matrices: improving classical bounds
- Learning Markov Models Via Low-Rank Optimization
- Robust multicategory support matrix machines
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- High-dimensional VAR with low-rank transition
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- On signal detection and confidence sets for low rank inference problems
- High-dimensional VARs with common factors
- Large dimensional latent factor modeling with missing observations and applications to causal inference
- A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
- Calmness of partial perturbation to composite rank constraint systems and its applications
- Factor Models for High-Dimensional Tensor Time Series
- Low rank multivariate regression
- Rank penalized estimators for high-dimensional matrices
- Asymptotic equivalence of quantum state tomography and noisy matrix completion
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Heterogeneity adjustment with applications to graphical model inference
- Sparse PCA: optimal rates and adaptive estimation
- Noisy low-rank matrix completion with general sampling distribution
- A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
- Cross: efficient low-rank tensor completion
- Several Classes of Stationary Points for Rank Regularized Minimization Problems
- Forecasting mortality with international linkages: a global vector-autoregression approach
- Maximum likelihood estimation of sparse networks with missing observations
- Robust low-rank matrix estimation
- On the degrees of freedom of mixed matrix regression
- Matrix Denoising for Weighted Loss Functions and Heterogeneous Signals
- Equivalent Lipschitz surrogates for zero-norm and rank optimization problems
- Sparse distance metric learning
- Sparse trace norm regularization
- Double fused Lasso penalized LAD for matrix regression
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Leveraging mixed and incomplete outcomes via reduced-rank modeling
- Autoregressive models for matrix-valued time series
- Regularized estimation in sparse high-dimensional time series models
- Geometric median and robust estimation in Banach spaces
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Statistical Inference for High-Dimensional Vector Autoregression with Measurement Error
- FACTORISABLE MULTITASK QUANTILE REGRESSION
- On the exponentially weighted aggregate with the Laplace prior
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Linear Models Based on Noisy Data and the Frisch Scheme
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Regularization parameter selection for the low rank matrix recovery
- Low-rank elastic-net regularized multivariate Huber regression model
- Error bound of critical points and KL property of exponent 1/2 for squared F-norm regularized factorization
- A semiparametric model for matrix regression
- Provable accelerated gradient method for nonconvex low rank optimization
- Dynamic Assortment Personalization in High Dimensions
- Intelligent Initialization and Adaptive Thresholding for Iterative Matrix Completion: Some Statistical and Algorithmic Theory for Adaptive-Impute
- Isolated calmness of solution mappings and exact recovery conditions for nuclear norm optimization problems
- Quantile trace regression via nuclear norm regularization
- Generalized high-dimensional trace regression via nuclear norm regularization
- Spectral analysis of high-dimensional time series
- Multivariate response regression with low-rank and generalized sparsity
- Orthogonal Rank-One Matrix Pursuit for Low Rank Matrix Completion
- ROP: matrix recovery via rank-one projections
- Matrix estimation by universal singular value thresholding
- Estimation of a sparse and spiked covariance matrix
- On model selection consistency of regularized M-estimators
- High-dimensional latent panel quantile regression with an application to asset pricing
- A new perspective on low-rank optimization
- Multiple change points detection in high-dimensional multivariate regression
- The EAS approach for graphical selection consistency in vector autoregression models
- Oracle inequality for sparse trace regression models with exponential \(\beta\)-mixing errors
- Robust Recovery of Low-Rank Matrices and Low-Tubal-Rank Tensors from Noisy Sketches
- The rate of convergence for sparse and low-rank quantile trace regression
- Robust Recommendation via Social Network Enhanced Matrix Completion
- Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
- Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method
- A framework of regularized low-rank matrix models for regression and classification
- Estimation and inference in a high-dimensional semiparametric Gaussian copula vector autoregressive model
- Sparse reduced-rank regression for simultaneous rank and variable selection via manifold optimization
- Profile GMM estimation of panel data models with interactive fixed effects
- Large volatility matrix analysis using global and national factor models
- Rate-optimal robust estimation of high-dimensional vector autoregressive models
- Universal Features for High-Dimensional Learning and Inference
- Robust matrix estimations meet Frank-Wolfe algorithm
- Mining the factor zoo: estimation of latent factor models with sufficient proxies
- High-dimensional low-rank tensor autoregressive time series modeling
- Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data
- High-dimensional estimation of quadratic variation based on penalized realized variance
- Imputed quantile tensor regression for near-sited spatial-temporal data
- Inference for low-rank models
- Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
- Expectile trace regression via low-rank and group sparsity regularization
- Distributed estimation and inference for spatial autoregression model with large scale networks
- Projected principal component analysis in factor models
- High-dimensional regression with unknown variance
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
Uses Software
Cites Work
- Sparse inverse covariance estimation with the graphical lasso
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Null space conditions and thresholds for rank minimization
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Regularization in statistics
- The benefit of group sparsity
- Covariance regularization by thresholding
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Sparse recovery problems in high dimensions: statistical inference and learning theory. Abstracts from the mini-workshop held March 15th -- March 21st, 2009.
- On the distribution of the largest eigenvalue in principal components analysis
- Sparse permutation invariant covariance estimation
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Simultaneous analysis of Lasso and Dantzig selector
- Support union recovery in high-dimensional multivariate regression
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- Exact matrix completion via convex optimization
- Compressed sensing and best \(k\)-term approximation
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Model selection and estimation in the Gaussian graphical model
- Decoding by Linear Programming
- Interior-Point Method for Nuclear Norm Approximation with Application to System Identification
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Matrix Analysis
- Atomic Decomposition by Basis Pursuit
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Semidefinite Programming
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Model Selection and Estimation in Regression with Grouped Variables
- Compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers