Asymptotics for sliced average variance estimation
Abstract: In this paper, we systematically study the consistency of sliced average variance estimation (SAVE). The findings reveal that when the response is continuous, the asymptotic behavior of SAVE is rather different from that of sliced inverse regression (SIR). SIR can achieve consistency even when each slice contains only two data points. However, SAVE cannot be √n consistent, and it even fails to be consistent when each slice contains a fixed number of data points that does not depend on n, where n is the sample size. These results theoretically confirm the notion that SAVE is more sensitive to the number of slices than SIR. Taking this into account, a bias correction is recommended in order to make SAVE consistent. In contrast, when the response is discrete and takes finitely many values, consistency can be achieved. Therefore, an approximation through discretization, which is commonly used in practice, is studied. A simulation study is carried out for the purpose of illustration.
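The slicing construction that the abstract analyzes can be sketched in a few lines. The following is a minimal illustrative implementation of the standard SAVE estimator, not code from the paper; the function name `save_directions`, the quantile-based slicing scheme, and the parameter defaults are assumptions made for illustration.

```python
import numpy as np

def save_directions(X, y, n_slices=5, d=1):
    """Illustrative sliced average variance estimation (SAVE).

    Standardizes X, slices the response into n_slices groups,
    forms M = sum_h p_h (I - V_h)^2 where V_h is the within-slice
    covariance of the standardized predictors, and returns the
    top-d eigenvectors of M mapped back to the original X scale.
    """
    n, p = X.shape
    # Standardize: Z = (X - mean) @ Sigma^{-1/2}
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice a continuous response by its empirical quantiles
    edges = np.quantile(y, np.linspace(0.0, 1.0, n_slices + 1))
    M = np.zeros((p, p))
    for h in range(n_slices):
        if h < n_slices - 1:
            mask = (y >= edges[h]) & (y < edges[h + 1])
        else:
            mask = y >= edges[h]
        if mask.sum() < 2:
            continue  # skip degenerate slices
        V_h = np.cov(Z[mask], rowvar=False)
        A_h = np.eye(p) - V_h
        M += mask.mean() * (A_h @ A_h)  # weight by slice proportion p_h
    # Top-d eigenvectors of M (eigh returns ascending order)
    _, V = np.linalg.eigh(M)
    eta = V[:, ::-1][:, :d]
    beta = inv_sqrt @ eta  # map back: beta = Sigma^{-1/2} eta
    return beta / np.linalg.norm(beta, axis=0)
```

For example, with a symmetric link such as y = x1² (a case where SIR is known to fail but SAVE succeeds), the estimated direction should align with the first coordinate axis. Note that, per the paper's result, the number of points per slice must grow with n for this estimator to be consistent when y is continuous.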
Recommendations
- On kernel method for sliced average variance estimation
- On splines approximation for sliced average variance estimation
- Save: a method for dimension reduction and graphics in regression
- Dimension reduction in regressions through weighted variance estimation
- Asymptotics for kernel estimation of slicing average third-moment estimation
Cites work
- Scientific article (zbMATH DE number 1713116)
- Scientific article (zbMATH DE number 1220060)
- Scientific article (zbMATH DE number 1932857)
- Scientific article (zbMATH DE number 788275)
- An Adaptive Estimation of Dimension Reduction Space
- An asymptotic theory for sliced inverse regression
- Approximation Theorems of Mathematical Statistics
- Asymptotics for kernel estimate of sliced inverse regression
- Convergence of stochastic processes
- Dimension reduction for conditional mean in regression
- Estimating Optimal Transformations for Multiple Regression and Correlation
- Identifying Regression Outliers and Mixtures Graphically
- Model checks for regression: an innovation process approach
- Nonlinear time series. Nonparametric and parametric methods
- Nonparametric checks for single-index models
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On Sliced Inverse Regression With High-Dimensional Covariates
- On hybrid methods of inverse regression-based algorithms
- On the Interpretation of Regression Plots
- Save: a method for dimension reduction and graphics in regression
- Simultaneous Equations and Canonical Correlation Theory
- Sliced Inverse Regression for Dimension Reduction
- Sufficient Dimension Reduction via Inverse Regression
- Using the Bootstrap to Select One of a New Class of Dimension Reduction Methods
Cited in (49)
- On determining the structural dimension via directional regression
- Partial central subspace and sliced average variance estimation
- Projection divergence in the reproducing kernel Hilbert space: asymptotic normality, block-wise and slicing estimation, and computational efficiency
- Penalized Weighted Variance Estimate for Dimension Reduction
- Computational Outlier Detection Methods in Sliced Inverse Regression
- Advanced topics in sliced inverse regression
- Sufficient dimension reduction in regressions through cumulative Hessian directions
- Multi-index regression models with missing covariates at random
- On splines approximation for sliced average variance estimation
- General directional regression
- Series expansion for functional sufficient dimension reduction
- Inference for the Dimension of a Regression Relationship Using Pseudo-Covariates
- The hybrid method of FSIR and FSAVE for functional effective dimension reduction
- Fused sliced average variance estimation
- On expectile-assisted inverse regression estimation for sufficient dimension reduction
- Nonlinear surface regression with dimension reduction method
- An alternating determination-optimization approach for an additive multi-index model
- Coordinate-independent sparse sufficient dimension reduction and variable selection
- SAVE: Robust or not?
- Data-driven slicing for dimension reduction in regressions: A likelihood-ratio approach
- Sliced Independence Test
- Sliced average variance estimation for multivariate time series
- On Partial Sufficient Dimension Reduction With Applications to Partially Linear Multi-Index Models
- Principal support vector machines for linear and nonlinear sufficient dimension reduction
- Distributed estimation in heterogeneous reduced rank regression: with application to order determination in sufficient dimension reduction
- Strong consistency of kernel method for sliced average variance estimation
- On kernel method for sliced average variance estimation
- On the extension of sliced average variance estimation to multivariate regression
- Estimation for a partial-linear single-index model
- A data-adaptive hybrid method for dimension reduction
- Distributed Sufficient Dimension Reduction for Heterogeneous Massive Data
- Functional sufficient dimension reduction: convergence rates and multiple functional case
- Dimension reduction in regressions through weighted variance estimation
- Generalized kernel-based inverse regression methods for sufficient dimension reduction
- Recursive kernel estimator in a semiparametric regression model
- Missing data analysis with sufficient dimension reduction
- Bagging Versions of Sliced Inverse Regression
- Contour projected dimension reduction
- An ensemble of inverse moment estimators for sufficient dimension reduction
- Dimension reduction via adaptive slicing
- Sliced average variance estimation for tensor data
- Partially linear estimation using sufficient dimension reduction
- An empirical process view of inverse regression
- Dimension reduction for the conditional \(k\)th moment via central solution space
- Dimension reduction using the generalized gradient direction
- Variable importance assessment in sliced inverse regression for variable selection
- A graphical tool for selecting the number of slices and the dimension of the model in SIR and SAVE approaches
- Dimension reduction based on weighted variance estimate
- Variable-dependent partial dimension reduction
MaRDI item Q997370