High-dimensional sufficient dimension reduction through principal projections
DOI: 10.1214/22-EJS1988 · zbMath: 1493.62320 · OpenAlex: W4226485703 · MaRDI QID: Q2136660
Andreas Artemiou, Eugen Pircalabelu
Publication date: 11 May 2022
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/journals/electronic-journal-of-statistics/volume-16/issue-1/High-dimensional-sufficient-dimension-reduction-through-principal-projections/10.1214/22-EJS1988.full
Keywords: quadratic programming; support vector machines; sufficient dimension reduction; \(\ell_1\) penalized estimation; debiased estimator
MSC: Factor analysis and principal components; correspondence analysis (62H25) · Estimation in multivariate analysis (62H12) · Hypothesis testing in multivariate analysis (62H15) · General nonlinear regression (62J02)
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Sparse estimation of a covariance matrix
- A well-conditioned estimator for large-dimensional covariance matrices
- Sparse Sliced Inverse Regression Via Lasso
- Sufficient dimension reduction via principal \(L_q\) support vector machine
- Confidence intervals for high-dimensional partially linear single-index models
- Fisher lecture: Dimension reduction in regression
- Principal support vector machines for linear and nonlinear sufficient dimension reduction
- High-dimensional covariance matrix estimation in approximate factor models
- Statistics for high-dimensional data. Methods, theory and applications.
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- Optimal rates of convergence for covariance matrix estimation
- Covariance regularization by thresholding
- Penalized principal logistic regression for sparse sufficient dimension reduction
- On consistency and sparsity for sliced inverse regression in high dimensions
- Dimension reduction for conditional mean in regression
- Support-vector networks
- Graph informed sliced inverse regression
- Simultaneous analysis of Lasso and Dantzig selector
- Kernel dimension reduction in regression
- Contour regression: a general approach to dimension reduction
- A Cost Based Reweighted Scheme of Principal Support Vector Machine
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- A constrained \(\ell_1\) minimization approach to sparse precision matrix estimation
- On Directional Regression for Dimension Reduction
- Sliced Inverse Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- Sufficient dimension reduction based on distance‐weighted discrimination
- Dimension Reduction in Regressions Through Cumulative Slicing Estimation
- Principal weighted support vector machines for sufficient dimension reduction in binary classification
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Comment