High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion
DOI: 10.1007/S11222-024-10399-4 · zbMATH Open: 1539.62014 · MaRDI QID: Q6547751
Xin Chen, Shuaida He, Runxiong Wu, Chang Deng, J. Zhang
Publication date: 31 May 2024
Published in: Statistics and Computing
Keywords: variable selection; sufficient dimension reduction; majorization-minimization; single-index models; large \(p\) small \(n\); Hilbert-Schmidt independence criterion
MSC classification: Computational methods for problems pertaining to statistics (62-08); Nonparametric regression and quantile regression (62G08); Estimation in multivariate analysis (62H12)
Cites Work
- Generalized alternating direction method of multipliers: new theoretical insights and applications
- Sliced Regression for Dimension Reduction
- Approximation Theorems of Mathematical Statistics
- On consistency and sparsity for sliced inverse regression in high dimensions
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Testing predictor contributions in sufficient dimension reduction
- Sliced Inverse Regression for Dimension Reduction
- Feature Screening via Distance Correlation Learning
- A note on shrinkage sliced inverse regression
- Sliced Inverse Regression with Regularizations
- Comment
- An Adaptive Estimation of Dimension Reduction Space
- Algorithmic Learning Theory
- Discussion of: Brownian distance covariance
- Regression analysis under link violation
- Fourier Methods for Estimating the Central Subspace and the Central Mean Subspace in Regression
- On Directional Regression for Dimension Reduction
- On the Interpretation of Regression Plots
- A Semiparametric Approach to Dimension Reduction
- Sparse Generalized Eigenvalue Problem: Optimal Statistical Rates via Truncated Rayleigh Flow
- Likelihood-Based Sufficient Dimension Reduction
- Model-Free Variable Selection
- Sufficient Dimension Reduction via Inverse Regression
- Principal fitted components for dimension reduction in regression
- Sparse Sliced Inverse Regression Via Lasso
- Sparse sufficient dimension reduction
- Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization
- A unified primal-dual algorithm framework based on Bregman iteration
- The Linearized Alternating Direction Method of Multipliers for Dantzig Selector
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- Graphics for Regressions With a Binary Response
- Coordinate-independent sparse sufficient dimension reduction and variable selection
- An integral transform method for estimating the central mean and central subspaces
- Sufficient dimension reduction based on an ensemble of minimum average variance estimators
- Estimating a sparse reduction for general regression in high dimensions
- Direction Estimation in Single-Index Regressions via Hilbert-Schmidt Independence Criterion
- A Review on Dimension Reduction
- Sequential Sufficient Dimension Reduction for Large p, Small n Problems
- Sparse SIR: optimal rates and adaptive estimation
- A convex formulation for high-dimensional sparse sliced inverse regression
- Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension
- MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection
- Sparse CCA: adaptive estimation and computational barriers
- Subspace Estimation with Automatic Dimension and Variable Selection in Sufficient Dimension Reduction
- Efficient Sparse Estimate of Sufficient Dimension Reduction in High Dimension