Coordinate-independent sparse sufficient dimension reduction and variable selection
From MaRDI portal
Abstract: Sufficient dimension reduction (SDR) in regression reduces the dimension by replacing the original predictors with a minimal set of their linear combinations, without loss of information, and is therefore very helpful when the number of predictors is large. Standard SDR methods suffer because the estimated linear combinations usually involve all of the original predictors, which makes them difficult to interpret. In this paper, we propose a unified method, coordinate-independent sparse estimation (CISE), that simultaneously achieves sparse sufficient dimension reduction and efficiently screens out irrelevant and redundant variables. CISE is subspace oriented in the sense that it incorporates a coordinate-independent penalty term into a broad family of model-based and model-free SDR approaches. This leads to an optimization problem on a Grassmann manifold, for which a fast algorithm is proposed. Under mild conditions, using manifold theories and techniques, we show that CISE performs asymptotically as well as if the true irrelevant predictors were known, a property referred to as the oracle property. Simulation studies and a real-data example demonstrate the effectiveness and efficiency of the proposed approach.
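The abstract's idea — a coordinate-independent (row-wise) sparsity penalty combined with an inverse-regression SDR objective, optimized over subspaces — can be caricatured in a few lines of numpy. This is a toy sketch, not the authors' CISE algorithm: the SIR candidate matrix, the group-lasso-style row penalty, and the simple gradient ascent with QR retraction onto the manifold are all illustrative assumptions.

```python
import numpy as np

def sir_matrix(X, y, n_slices=5):
    """Candidate matrix from sliced inverse regression (SIR): the weighted
    covariance of slice means of the whitened predictors."""
    n, p = X.shape
    L = np.linalg.cholesky(np.cov(X.T))        # Sigma = L L^T
    Z = (X - X.mean(0)) @ np.linalg.inv(L).T   # whitened predictors
    M = np.zeros((p, p))
    for s in np.array_split(np.argsort(y), n_slices):
        m = Z[s].mean(0)                       # slice mean
        M += len(s) / n * np.outer(m, m)
    return M

def cise_sketch(M, d, lam=0.1, step=0.05, iters=500):
    """Maximize tr(B' M B) - lam * sum_j ||row_j(B)|| over matrices B with
    orthonormal columns, i.e. a row-sparse basis of a d-dimensional subspace.
    The row-norm penalty is coordinate independent: it depends on the span of
    B, not on the particular basis.  Plain gradient ascent, retracted back to
    the manifold by a QR factorization after each step."""
    p = M.shape[0]
    rng = np.random.default_rng(0)
    B, _ = np.linalg.qr(rng.standard_normal((p, d)))
    for _ in range(iters):
        rows = np.linalg.norm(B, axis=1, keepdims=True)
        grad = 2 * M @ B - lam * B / np.maximum(rows, 1e-8)
        B, _ = np.linalg.qr(B + step * grad)   # retraction
    return B
```

In this sketch, rows of the returned basis with near-zero norm correspond to predictors screened out of the reduction, which is the variable-selection effect the abstract describes.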
Recommendations
- Sparse sufficient dimension reduction
- Sparse sufficient dimension reduction using optimal scoring
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- On the consistency of coordinate-independent sparse estimation with BIC
- Estimating a sparse reduction for general regression in high dimensions
Cites work
- scientific article; zbMATH DE number 3983386
- scientific article; zbMATH DE number 1220060
- scientific article; zbMATH DE number 1850469
- scientific article; zbMATH DE number 788275
- A note on shrinkage sliced inverse regression
- Asymptotics for sliced average variance estimation
- Contour regression: a general approach to dimension reduction
- Dimension Reduction for the Conditional \(k\)th Moment in Regression
- Dimension reduction based on constrained canonical correlation and variable filtering
- Dimension reduction for conditional mean in regression
- Fisher lecture: Dimension reduction in regression
- Ideal spatial adaptation by wavelet shrinkage
- Likelihood-based sufficient dimension reduction
- Model Selection and Estimation in Regression with Grouped Variables
- Model-Free Variable Selection
- On Directional Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On consistency and sparsity for principal components analysis in high dimensions
- On minimization on Stiefel manifolds
- On the Interpretation of Regression Plots
- Optimization algorithms exploiting unitary constraints
- Principal Hessian Directions Revisited
- Principal fitted components for dimension reduction in regression
- Regression Model Selection—A Residual Likelihood Approach
- Sliced Inverse Regression for Dimension Reduction
- Smooth nonlinear optimization of \(\mathbb R^n\)
- Sparse sufficient dimension reduction
- Sufficient dimension reduction in regressions with categorical predictors
- Testing predictor contributions in sufficient dimension reduction
- The Adaptive Lasso and Its Oracle Properties
- The Geometry of Algorithms with Orthogonality Constraints
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (78)
- Concordance-based estimation approaches for the optimal sufficient dimension reduction score
- Sufficient dimension reduction for censored predictors
- Sufficient dimension reduction via random-partitions for the large-\(p\)-small-\(n\) problem
- Some aspects of response variable selection and estimation in multivariate linear regression
- Envelope-based sparse partial least squares
- A distribution-based Lasso for a general single-index model
- Variable selection and estimation for semi-parametric multiple-index models
- A selective overview of sparse sufficient dimension reduction
- MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection
- Sparse sufficient dimension reduction for Markov blanket discovery
- Sufficient dimension folding via tensor inverse regression
- Model averaging assisted sufficient dimension reduction
- Model-free variable selection for conditional mean in regression
- On expectile-assisted inverse regression estimation for sufficient dimension reduction
- Sufficient dimension reduction in multivariate regressions with categorical predictors
- Nonlinear surface regression with dimension reduction method
- Semiparametric model average prediction in panel data analysis
- Robust sufficient dimension reduction via ball covariance
- Simultaneous estimation for semi-parametric multi-index models
- Dimension-wise sparse low-rank approximation of a matrix with application to variable selection in high-dimensional integrative analyses of association
- A dimension reduction based approach for estimation and variable selection in partially linear single-index models with high-dimensional covariates
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- Supervised dimension reduction for ordinal predictors
- Bayesian model averaging sliced inverse regression
- The maximum separation subspace in sufficient dimension reduction with categorical response
- On the conditional distributions of low-dimensional projections from high-dimensional data
- Dimension reduction regressions with measurement errors subject to additive distortion
- Partial linear single index models with distortion measurement errors
- Sparse SIR: optimal rates and adaptive estimation
- Dimension reduction based on conditional multiple index density function
- On Partial Sufficient Dimension Reduction With Applications to Partially Linear Multi-Index Models
- A sequential test for variable selection in high dimensional complex data
- Sparsity considerations for dependent variables
- Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
- Sufficient dimension reduction using Hilbert-Schmidt independence criterion
- Covariate information matrix for sufficient dimension reduction
- Sparse sufficient dimension reduction using optimal scoring
- On post dimension reduction statistical inference
- Sparse sufficient dimension reduction
- Fourier transform sparse inverse regression estimators for sufficient variable selection
- Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem
- Dimension reduction for block-missing data based on sparse sliced inverse regression
- On the single-index model estimate of the conditional density function: consistency and implementation
- A Review on Sliced Inverse Regression, Sufficient Dimension Reduction, and Applications
- Feature filter for estimating central mean subspace and its sparse solution
- A model-free variable selection method for reducing the number of redundant variables
- The effect of data contamination in sliced inverse regression and finite sample breakdown point
- Estimating a sparse reduction for general regression in high dimensions
- An ensemble of inverse moment estimators for sufficient dimension reduction
- QUADRO: a supervised dimension reduction method via Rayleigh quotient optimization
- Dimension reduction via adaptive slicing
- Minimax sparse principal subspace estimation in high dimensions
- Ranking the importance of variables in nonlinear system identification
- On the consistency of coordinate-independent sparse estimation with BIC
- Learning sparse gradients for variable selection and dimension reduction
- Estimation and variable selection in partial linear single index models with error-prone linear covariates
- Trace pursuit variable selection for multi-population data
- Slicing-free inverse regression in high-dimensional sufficient dimension reduction
- High-Dimensional Elliptical Sliced Inverse Regression in Non-Gaussian Distributions
- Subspace Estimation with Automatic Dimension and Variable Selection in Sufficient Dimension Reduction
- Higher-order sliced inverse regressions
- Envelopes and principal component regression
- A Review of Envelope Models
- [HDDA] sparse subspace constrained partial least squares
- Matching Using Sufficient Dimension Reduction for Causal Inference
- High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion
- Sparse dimension reduction based on energy and ball statistics
- Testing the Linear Mean and Constant Variance Conditions in Sufficient Dimension Reduction
- High-dimensional local polynomial regression with variable selection and dimension reduction
- Variable selection using data splitting and projection for principal fitted component models in high dimension
- First-order algorithms for a class of fractional optimization problems
- Response variable selection in multivariate linear regression
- New forest-based approaches for sufficient dimension reduction
- Exact penalty function for \(\ell_{2,1}\) norm minimization over the Stiefel manifold
- Pruning a sufficient dimension reduction with a \(p\)-value guided hard-thresholding
- Efficient Sparse Estimate of Sufficient Dimension Reduction in High Dimension
- A penalty-free infeasible approach for a class of nonsmooth optimization problems over the Stiefel manifold
- Variable-dependent partial dimension reduction
This page was built for publication: Coordinate-independent sparse sufficient dimension reduction and variable selection (MaRDI item Q620565)