Deep nonlinear sufficient dimension reduction
DOI: 10.1214/24-AOS2390 · MaRDI QID: Q6608686
Authors: Yinfeng Chen, Yuling Jiao, Rui Qiu, Zhou Yu
Publication date: 20 September 2024
Published in: The Annals of Statistics
Keywords: sufficient dimension reduction; U-process; deep neural networks; generalized martingale difference divergence
MSC classifications: Nonparametric regression and quantile regression (62G08); Estimation in multivariate analysis (62H12); Prediction theory (aspects of stochastic processes) (60G25); Neural nets and related approaches to inference from stochastic processes (62M45)
Cites Work
- Efficient estimation in sufficient dimension reduction
- Measuring and testing dependence by correlation of distances
- Weak convergence and empirical processes. With applications to statistics
- Reducing the Dimensionality of Data with Neural Networks
- Determining the Dimension in Sliced Inverse Regression and Related Methods
- Sliced Inverse Regression for Dimension Reduction
- Determining the Dimensionality in Sliced Inverse Regression
- Comment
- High-dimensional statistics. A non-asymptotic viewpoint
- An Adaptive Estimation of Dimension Reduction Space
- Algorithmic Learning Theory
- A Class of Statistics with Asymptotically Normal Distribution
- Dimension reduction for nonelliptically distributed predictors
- Regression analysis under link violation
- Contour regression: a general approach to dimension reduction
- On Directional Regression for Dimension Reduction
- Optimal global rates of convergence for nonparametric regression
- Expected conditional characteristic function-based measures for testing independence
- Asymptotics for kernel estimate of sliced inverse regression
- Local Rademacher complexities
- Martingale difference correlation and its use in high-dimensional variable screening
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Principal Hessian Directions Revisited
- Principal support vector machines for linear and nonlinear sufficient dimension reduction
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- An RKHS formulation of the inverse regression dimension-reduction problem
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- On Estimation Efficiency of the Central Mean Subspace
- Nonlinear sufficient dimension reduction for functional data
- Estimating the structural dimension of regressions via parametric inverse regression
- Detecting independence of random vectors: generalized distance covariance and Gaussian covariance
- A new class of measures for testing independence
- Dimension Reduction Forests: Local Variable Importance Using Structured Random Forests
- Nonparametric regression using deep neural networks with ReLU activation function
- Martingale Difference Divergence Matrix and Its Application to Dimension Reduction for Stationary Multivariate Time Series
- Deep network approximation characterized by number of neurons
- Fusing sufficient dimension reduction with neural networks
- A kernel-based measure for conditional mean dependence
- Conditional variance estimator for sufficient dimension reduction
- Generalized martingale difference divergence: detecting conditional mean independence with applications in variable screening
- Fréchet sufficient dimension reduction for random objects
- Deep dimension reduction for supervised representation learning