High-dimensional variable selection with sparse random projections: measurement sparsity and statistical efficiency
From MaRDI portal
Publication: 2896148
zbMATH Open: 1242.62070
arXiv: 0805.3005
MaRDI QID: Q2896148
Authors: Dapo Omidiran, Martin J. Wainwright
Publication date: 13 July 2012
Published in: Journal of Machine Learning Research (JMLR)
Abstract: We consider the problem of estimating the support of a vector based on observations contaminated by noise. A significant body of work has studied the behavior of \(\ell_1\)-relaxations when applied to measurement matrices drawn from standard dense ensembles (e.g., Gaussian, Bernoulli). In this paper, we analyze \emph{sparsified} measurement ensembles, and consider the trade-off between measurement sparsity, as measured by the fraction \(\gamma\) of non-zero entries, and the statistical efficiency, as measured by the minimal number of observations required for exact support recovery with probability converging to one. Our main result is to prove that it is possible to let the fraction \(\gamma\) tend to zero at some rate, yielding measurement matrices with a vanishing fraction of non-zeros per row while retaining the same statistical efficiency as dense ensembles. A variety of simulation results confirm the sharpness of our theoretical predictions.
Full work available at URL: https://arxiv.org/abs/0805.3005
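The trade-off described in the abstract can be illustrated numerically. The sketch below builds a sparsified measurement matrix (each entry non-zero with probability \(\gamma\), rescaled to unit variance), observes a sparse vector through it with additive noise, and recovers the support via an \(\ell_1\)-relaxation solved by plain iterative soft-thresholding (ISTA). All parameter values, the rescaling choice, and the solver are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: n observations, ambient dimension p, sparsity k.
n, p, k = 250, 500, 5
gamma = 0.1    # fraction of non-zero entries in the measurement matrix
sigma = 0.1    # noise standard deviation

# k-sparse signal beta* with a random support set.
beta_star = np.zeros(p)
support = np.sort(rng.choice(p, size=k, replace=False))
beta_star[support] = 2.5 * rng.choice([-1.0, 1.0], size=k)

# Sparsified ensemble: each entry is non-zero with probability gamma,
# rescaled by 1/sqrt(gamma) so every entry has unit variance.
mask = rng.random((n, p)) < gamma
X = mask * rng.standard_normal((n, p)) / np.sqrt(gamma)

y = X @ beta_star + sigma * rng.standard_normal(n)

def ista_lasso(X, y, lam, iters=3000):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        z = b - X.T @ (X @ b - y) / L  # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return b

# Regularization on the order of sigma * sqrt(n log p), a standard Lasso scaling.
lam = 2.0 * sigma * np.sqrt(2.0 * n * np.log(p))
beta_hat = ista_lasso(X, y, lam)

est_support = np.flatnonzero(np.abs(beta_hat) > 0.5)
print("true support:     ", support)
print("estimated support:", est_support)
```

With these (assumed) settings only a fraction \(\gamma = 0.1\) of the measurement entries is non-zero, yet the \(\ell_1\)-relaxation still identifies the correct support, mirroring the sparsity/efficiency trade-off the abstract analyzes.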
Recommendations
- Variable selection in high-dimension with random designs and orthogonal matching pursuit
- Variable selection in sparse regression with quadratic measurements
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Sparse covariance thresholding for high-dimensional variable selection
- Near-ideal model selection by \(\ell _{1}\) minimization
Cited In (9)
- Projective inference in high-dimensional problems: prediction and feature selection
- Generalized Sparse Precision Matrix Selection for Fitting Multivariate Gaussian Random Fields to Large Data Sets
- Selection of sparse vine copulas in high dimensions with the Lasso
- Dimension-wise sparse low-rank approximation of a matrix with application to variable selection in high-dimensional integrative analyzes of association
- Variable selection in high-dimension with random designs and orthogonal matching pursuit
- How can we identify the sparsity structure pattern of high-dimensional data: an elementary statistical analysis to interpretable machine learning
- A Sparse Random Projection-Based Test for Overall Qualitative Treatment Effects
- Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
- Sparse learning for large-scale and high-dimensional data: a randomized convex-concave optimization approach