Coordinate-independent sparse sufficient dimension reduction and variable selection
From MaRDI portal
Abstract: Sufficient dimension reduction (SDR) in regression, which reduces the dimension by replacing the original predictors with a minimal set of their linear combinations without loss of information, is very helpful when the number of predictors is large. Standard SDR methods suffer because the estimated linear combinations usually involve all of the original predictors, making them difficult to interpret. In this paper, we propose a unified method, coordinate-independent sparse estimation (CISE), that can simultaneously achieve sparse sufficient dimension reduction and efficiently screen out irrelevant and redundant variables. CISE is subspace oriented in the sense that it incorporates a coordinate-independent penalty term with a broad class of model-based and model-free SDR approaches. This results in a Grassmann manifold optimization problem, for which a fast algorithm is suggested. Under mild conditions, using manifold theory and techniques, CISE is shown to perform asymptotically as well as if the true irrelevant predictors were known, which is referred to as the oracle property. Simulation studies and a real-data example demonstrate the effectiveness and efficiency of the proposed approach.
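The "coordinate-independent" property of the penalty can be illustrated with a minimal sketch. Assuming a group penalty of the form described in the abstract, i.e. the sum of the row norms of a basis matrix \(B\) for the reduction subspace (the function name `ci_penalty` and all variable names here are illustrative, not from the paper), the penalty value is unchanged when \(B\) is rotated by any orthogonal matrix, so it penalizes the subspace rather than one particular basis:

```python
import numpy as np

def ci_penalty(B):
    # Sum of the Euclidean norms of the rows of B. Each row norm depends
    # only on the diagonal of B @ B.T, so replacing B by B @ O for any
    # orthogonal O leaves the penalty unchanged: it depends on span(B),
    # not on the choice of basis. A zero row corresponds to a predictor
    # screened out of every linear combination.
    return np.sum(np.linalg.norm(B, axis=1))

rng = np.random.default_rng(0)
B = rng.standard_normal((10, 3))                  # p = 10 predictors, d = 3 directions
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # a random 3x3 orthogonal matrix

print(np.isclose(ci_penalty(B), ci_penalty(B @ Q)))  # True: rotation-invariant
```

By contrast, an entrywise lasso penalty \(\sum_{ij} |b_{ij}|\) is not rotation-invariant, which is why a row-wise (subspace-oriented) penalty is the natural choice when only the span of the estimated directions is identifiable.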
Recommendations
- Sparse sufficient dimension reduction
- Sparse sufficient dimension reduction using optimal scoring
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- On the consistency of coordinate-independent sparse estimation with BIC
- Estimating a sparse reduction for general regression in high dimensions
Cites work
- scientific article (zbMATH DE number 3983386; no title available)
- scientific article (zbMATH DE number 1220060; no title available)
- scientific article (zbMATH DE number 1850469; no title available)
- scientific article (zbMATH DE number 788275; no title available)
- A note on shrinkage sliced inverse regression
- Asymptotics for sliced average variance estimation
- Contour regression: a general approach to dimension reduction
- Dimension Reduction for the Conditional \(k\)th Moment in Regression
- Dimension reduction based on constrained canonical correlation and variable filtering
- Dimension reduction for conditional mean in regression
- Fisher lecture: Dimension reduction in regression
- Ideal spatial adaptation by wavelet shrinkage
- Likelihood-based sufficient dimension reduction
- Model Selection and Estimation in Regression with Grouped Variables
- Model-Free Variable Selection
- On Directional Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On consistency and sparsity for principal components analysis in high dimensions
- On minimization on Stiefel manifolds
- On the Interpretation of Regression Plots
- Optimization algorithms exploiting unitary constraints
- Principal Hessian Directions Revisited
- Principal fitted components for dimension reduction in regression
- Regression Model Selection—A Residual Likelihood Approach
- Sliced Inverse Regression for Dimension Reduction
- Smooth nonlinear optimization of \(\mathbb R^n\)
- Sparse sufficient dimension reduction
- Sufficient dimension reduction in regressions with categorical predictors
- Testing predictor contributions in sufficient dimension reduction.
- The Adaptive Lasso and Its Oracle Properties
- The Geometry of Algorithms with Orthogonality Constraints
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (78)
- Efficient Sparse Estimate of Sufficient Dimension Reduction in High Dimension
- Minimax sparse principal subspace estimation in high dimensions
- Dimension reduction regressions with measurement errors subject to additive distortion
- Estimation and variable selection in partial linear single index models with error-prone linear covariates
- MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection
- Model averaging assisted sufficient dimension reduction
- Model-free variable selection for conditional mean in regression
- On expectile-assisted inverse regression estimation for sufficient dimension reduction
- Simultaneous estimation for semi-parametric multi-index models
- Nonlinear surface regression with dimension reduction method
- Ranking the importance of variables in nonlinear system identification
- Variable selection using data splitting and projection for principal fitted component models in high dimension
- Sufficient dimension reduction using Hilbert-Schmidt independence criterion
- Semiparametric model average prediction in panel data analysis
- [HDDA] sparse subspace constrained partial least squares
- Subspace Estimation with Automatic Dimension and Variable Selection in Sufficient Dimension Reduction
- Dimension reduction based on conditional multiple index density function
- A sequential test for variable selection in high dimensional complex data
- Sparse sufficient dimension reduction using optimal scoring
- The effect of data contamination in sliced inverse regression and finite sample breakdown point
- Sufficient dimension folding via tensor inverse regression
- A selective overview of sparse sufficient dimension reduction
- Supervised dimension reduction for ordinal predictors
- Dimension reduction via adaptive slicing
- Testing the Linear Mean and Constant Variance Conditions in Sufficient Dimension Reduction
- Sparse dimension reduction based on energy and ball statistics
- Variable selection and estimation for semi-parametric multiple-index models
- Sufficient dimension reduction via random-partitions for the large-\(p\)-small-\(n\) problem
- Feature filter for estimating central mean subspace and its sparse solution
- Covariate information matrix for sufficient dimension reduction
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- The maximum separation subspace in sufficient dimension reduction with categorical response
- Bayesian model averaging sliced inverse regression
- Response variable selection in multivariate linear regression
- Learning sparse gradients for variable selection and dimension reduction
- Variable-dependent partial dimension reduction
- Sparse SIR: optimal rates and adaptive estimation
- On post dimension reduction statistical inference
- Estimating a sparse reduction for general regression in high dimensions
- QUADRO: a supervised dimension reduction method via Rayleigh quotient optimization
- On the consistency of coordinate-independent sparse estimation with BIC
- Sufficient dimension reduction for censored predictors
- Sufficient dimension reduction in multivariate regressions with categorical predictors
- Robust sufficient dimension reduction via ball covariance
- Sparse sufficient dimension reduction for Markov blanket discovery
- Envelopes and principal component regression
- A Review of Envelope Models
- High-Dimensional Elliptical Sliced Inverse Regression in Non-Gaussian Distributions
- Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem
- On the single-index model estimate of the conditional density function: consistency and implementation
- Dimension-wise sparse low-rank approximation of a matrix with application to variable selection in high-dimensional integrative analyzes of association
- High-dimensional local polynomial regression with variable selection and dimension reduction
- A dimension reduction based approach for estimation and variable selection in partially linear single-index models with high-dimensional covariates
- New forest-based approaches for sufficient dimension reduction
- An ensemble of inverse moment estimators for sufficient dimension reduction
- Some aspects of response variable selection and estimation in multivariate linear regression
- Envelope-based sparse partial least squares
- On the conditional distributions of low-dimensional projections from high-dimensional data
- A model-free variable selection method for reducing the number of redundant variables
- Trace pursuit variable selection for multi-population data
- Partial linear single index models with distortion measurement errors
- Sparsity considerations for dependent variables
- Dimension reduction for block-missing data based on sparse sliced inverse regression
- Concordance-based estimation approaches for the optimal sufficient dimension reduction score
- Slicing-free inverse regression in high-dimensional sufficient dimension reduction
- On Partial Sufficient Dimension Reduction With Applications to Partially Linear Multi-Index Models
- A distribution-based Lasso for a general single-index model
- Exact penalty function for \(\ell_{2,1}\) norm minimization over the Stiefel manifold
- Fourier transform sparse inverse regression estimators for sufficient variable selection
- Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
- Higher-order sliced inverse regressions
- Matching Using Sufficient Dimension Reduction for Causal Inference
- A Review on Sliced Inverse Regression, Sufficient Dimension Reduction, and Applications
- Pruning a sufficient dimension reduction with a \(p\)-value guided hard-thresholding
- A penalty-free infeasible approach for a class of nonsmooth optimization problems over the Stiefel manifold
- High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion
- First-order algorithms for a class of fractional optimization problems
- Sparse sufficient dimension reduction