Sparse dimension reduction based on energy and ball statistics
From MaRDI portal
Publication:6161663
Abstract: As its name suggests, sufficient dimension reduction (SDR) aims to estimate from data a subspace that contains all the information needed to explain a dependent variable. Many approaches to SDR exist, some of the most recent of which rely on minimal to no model assumptions. These are defined through an optimization criterion that maximizes a nonparametric measure of association. The original estimators are nonsparse, meaning that all variables contribute to the model. In many practical applications, however, an SDR technique is called for that is sparse and, as such, intrinsically performs sufficient variable selection (SVS). This paper examines how such a sparse SDR estimator can be constructed. Three variants are investigated, based on different measures of association: distance covariance, martingale difference divergence and ball covariance. A simulation study shows that each of these estimators can achieve correct variable selection in highly nonlinear contexts, yet is sensitive to outliers and computationally intensive. The study sheds light on the subtle differences between the methods. Two examples illustrate how these new estimators can be applied in practice, with a slight preference for the option based on martingale difference divergence in the bioinformatics example.
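The estimators described above maximize a nonparametric dependence measure over projection directions. As an illustrative sketch only (not the paper's implementation), one such measure, the sample distance covariance of Székely et al., can be computed from double-centered pairwise distance matrices:

```python
import numpy as np

def pairwise_dist(Z):
    """Euclidean distance matrix of the rows of Z (1-D input treated as a column)."""
    Z = np.asarray(Z, dtype=float)
    if Z.ndim == 1:
        Z = Z[:, None]
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def double_center(D):
    """Subtract row and column means, add back the grand mean."""
    return D - D.mean(axis=0) - D.mean(axis=1)[:, None] + D.mean()

def dcov(X, Y):
    """V-statistic sample distance covariance between samples X and Y."""
    A = double_center(pairwise_dist(X))
    B = double_center(pairwise_dist(Y))
    return np.sqrt(np.mean(A * B))
```

In a sparse SDR method of the kind studied here, a quantity like `dcov` between the response and a projection of the predictors would be maximized subject to a sparsity constraint on the projection matrix; the double-centering step is what makes the measure zero (in population) exactly under independence.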
Cites work
- Scientific article (zbMATH DE number 1373656; title unavailable)
- A martingale-difference-divergence-based estimation of central mean subspace
- Ball Covariance: A Generic Measure of Dependence in Banach Space
- Coordinate-independent sparse sufficient dimension reduction and variable selection
- Dimension reduction based on constrained canonical correlation and variable filtering
- Distance covariance in metric spaces
- Energy statistics: a class of statistics based on distances
- Hedonic housing prices and the demand for clean air
- Martingale difference correlation and its use in high-dimensional variable screening
- Measuring and testing dependence by correlation of distances
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- Partial martingale difference correlation
- Principal angles between subspaces in an A-based scalar product: Algorithms and perturbation estimates
- Projection pursuit
- Robust sufficient dimension reduction via ball covariance
- Sequential sufficient dimension reduction for large \(p\), small \(n\) problems
- Shrinkage Inverse Regression Estimation for Model-Free Variable Selection
- Sliced Inverse Regression with Regularizations
- Sparse Partial Least Squares Regression for Simultaneous Dimension Reduction and Variable Selection
- Sparse principal component regression via singular value decomposition approach
- Sparse sliced inverse regression via Lasso
- Sparse sufficient dimension reduction
- Sufficient dimension reduction: methods and applications with R
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Testing predictor contributions in sufficient dimension reduction
- The Geometry of Algorithms with Orthogonality Constraints
- The distance correlation \(t\)-test of independence in high dimension
- Use of Ranks in One-Criterion Variance Analysis
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties