On the conditional distributions of low-dimensional projections from high-dimensional data
DOI: 10.1214/12-AOS1081
zbMATH Open: 1360.62371
arXiv: 1304.5943
OpenAlex: W2049955185
MaRDI QID: Q355082
Authors: Hannes Leeb
Publication date: 24 July 2013
Published in: The Annals of Statistics
Abstract: We study the conditional distribution of low-dimensional projections from high-dimensional data, where the conditioning is on other low-dimensional projections. To fix ideas, consider a random \(d\)-vector \(Z\) that has a Lebesgue density and that is standardized so that \(\mathbb{E}Z = 0\) and \(\mathbb{E}ZZ' = I_d\). Moreover, consider two projections defined by unit vectors \(\alpha\) and \(\beta\), namely a response \(y = \alpha'Z\) and an explanatory variable \(x = \beta'Z\). It has long been known that the conditional mean of \(y\) given \(x\) is approximately linear in \(x\); we show, in addition, that the conditional variance of \(y\) given \(x\) is approximately constant. These results hold uniformly in \(\alpha\) and for most \(\beta\)'s, provided only that the dimension of \(Z\) is large. In that sense, we see that most linear submodels of a high-dimensional overall model are approximately correct. Our findings provide new insights in a variety of modeling scenarios. We discuss several examples, including sliced inverse regression, sliced average variance estimation, generalized linear models under potential link violation, and sparse linear modeling.
Full work available at URL: https://arxiv.org/abs/1304.5943
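The abstract's claim can be illustrated numerically. Below is a minimal simulation sketch, not taken from the paper: the centred exponential distribution for the components of \(Z\), the dimensions, and the binning of \(x\) are illustrative assumptions. Under the standardization \(\mathbb{E}Z = 0\), \(\mathbb{E}ZZ' = I_d\), the best linear predictor of \(y\) from \(x\) has slope \(\alpha'\beta\) and residual variance \(1 - (\alpha'\beta)^2\); the binned conditional means and variances of \(y\) given \(x\) should stay close to this benchmark when \(d\) is large.

```python
# Minimal simulation sketch (assumptions: centred-exponential Z, illustrative d and n).
import numpy as np

rng = np.random.default_rng(0)
d, n = 200, 50_000  # high dimension, large sample (illustrative values)

# Non-Gaussian d-vector Z with EZ = 0 and EZZ' = I_d (independent standardized components).
Z = rng.exponential(scale=1.0, size=(n, d)) - 1.0

# Random unit vectors defining the two projections.
alpha = rng.standard_normal(d); alpha /= np.linalg.norm(alpha)
beta  = rng.standard_normal(d); beta  /= np.linalg.norm(beta)

y = Z @ alpha  # response y = alpha'Z
x = Z @ beta   # explanatory variable x = beta'Z

# Benchmark under the standardization: best linear predictor of y from x has
# slope alpha'beta and residual variance 1 - (alpha'beta)^2.
slope = alpha @ beta
print(f"alpha'beta = {slope:+.4f},  1 - (alpha'beta)^2 = {1 - slope**2:.4f}")

# Bin x and compare binned conditional means/variances with the linear benchmark.
edges = np.quantile(x, np.linspace(0.1, 0.9, 9))
labels = np.digitize(x, edges)
for k in np.unique(labels):
    sel = labels == k
    xm = x[sel].mean()
    print(f"bin {k}: mean(x)={xm:+.2f}  E[y|x]~{y[sel].mean():+.4f} "
          f"(linear: {slope * xm:+.4f})  Var[y|x]~{y[sel].var():.4f}")
```

Note that for random unit vectors in high dimension \(\alpha'\beta\) is typically close to zero, so in this sketch the approximate linearity and the approximate constancy of the conditional variance show up as near-zero binned means and near-unit binned variances across the range of \(x\).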
Recommendations
- On almost linearity of low dimensional projections from high dimensional data
- Approximation of projections of random vectors
- On conditional moments of high-dimensional random vectors given lower-dimensional projections
- On Sliced Inverse Regression With High-Dimensional Covariates
- On low-dimensional projections of high-dimensional distributions
Mathematics Subject Classification
- Multivariate analysis (62H99)
- Asymptotic distribution theory in statistics (62E20)
- Estimation in multivariate analysis (62H12)
Cites Work
- Sliced Inverse Regression for Dimension Reduction
- On almost linearity of low dimensional projections from high dimensional data
- Real Analysis and Probability
- The Large-Sample Power of Tests Based on Permutations of Observations
- Regression analysis under link violation
- Sufficient dimension reduction in regressions with categorical predictors
- Dimension reduction for non-elliptically distributed predictors: second-order methods
- On Directional Regression for Dimension Reduction
- Asymptotics of graphical projection pursuit
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- The normal distribution. Characterizations with applications
- Principal support vector machines for linear and nonlinear sufficient dimension reduction
- On low-dimensional projections of high-dimensional distributions
- A characterization of spherical distributions
- Nonlinear confounding in high-dimensional regression
- On the conditional distributions of low-dimensional projections from high-dimensional data
- Coordinate-independent sparse sufficient dimension reduction and variable selection
Cited In (7)
- A Random Projection Approach to Hypothesis Tests in High-Dimensional Single-Index Models
- STATISTICAL INFERENCE WITH F-STATISTICS WHEN FITTING SIMPLE MODELS TO HIGH-DIMENSIONAL DATA
- Projective inference in high-dimensional problems: prediction and feature selection
- On conditional moments of high-dimensional random vectors given lower-dimensional projections
- An Inverse-regression Method of Dependent Variable Transformation for Dimension Reduction with Non-linear Confounding
- On the conditional distributions of low-dimensional projections from high-dimensional data
- On almost linearity of low dimensional projections from high dimensional data