Comment
From MaRDI portal
Publication:5892253
DOI: 10.1080/01621459.1991.10475036 · zbMath: 1353.62037 · OpenAlex: W4230676649 · Wikidata: Q60571522 · Scholia: Q60571522 · MaRDI QID: Q5892253
R. Dennis Cook, Sanford Weisberg
Publication date: 20 January 2017
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1080/01621459.1991.10475036
Multivariate analysis (62H99) · Factor analysis and principal components; correspondence analysis (62H25) · Density estimation (62G07)
Related Items (only showing first 100 items)
Impact of sufficient dimension reduction in nonparametric estimation of causal effect ⋮ Pseudo likelihood and dimension reduction for data with nonignorable nonresponse ⋮ A selective overview of sparse sufficient dimension reduction ⋮ Quantile treatment effect estimation with dimension reduction ⋮ Online sparse sliced inverse regression for high-dimensional streaming data ⋮ Covariate Information Number for Feature Screening in Ultrahigh-Dimensional Supervised Problems ⋮ Recursive kernel estimator in a semiparametric regression model ⋮ Estimation of value-at-risk using single index quantile regression ⋮ Quantile Martingale Difference Divergence for Dimension Reduction ⋮ Dimension reduction via adaptive slicing ⋮ Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units ⋮ Sliced Inverse Regression in Metric Spaces ⋮ A Minimum Discrepancy Approach With Fourier Transform in Sufficient Dimension Reduction ⋮ Likelihood-Based Dimension Folding on Tensor Data ⋮ Distributed Sufficient Dimension Reduction for Heterogeneous Massive Data ⋮ Dimension Reduction Forests: Local Variable Importance Using Structured Random Forests ⋮ Sparse sufficient dimension reduction with heteroscedasticity ⋮ Graph-Assisted Inverse Regression for Count Data and Its Application to Sequencing Data ⋮ Model-Free Variable Selection With Matrix-Valued Predictors ⋮ A METHOD OF LOCAL INFLUENCE ANALYSIS IN SUFFICIENT DIMENSION REDUCTION ⋮ K-medoids inverse regression ⋮ Variable importance assessment in sliced inverse regression for variable selection ⋮ SAVE: Robust or not? ⋮ Model Checking in Large-Scale Dataset via Structure-Adaptive-Sampling ⋮ Nonparametric Estimation of Conditional Expectation with Auxiliary Information and Dimension Reduction ⋮ A new reproducing kernel‐based nonlinear dimension reduction method for survival data ⋮ A structured covariance ensemble for sufficient dimension reduction ⋮ Variable-dependent partial dimension reduction ⋮ Strong consistency of kernel method for sliced average variance estimation ⋮ Missing data analysis with sufficient dimension reduction ⋮ A note on marginal coordinate test in sufficient dimension reduction ⋮ Dimension reduction with expectation of conditional difference measure ⋮ Inference for the Dimension of a Regression Relationship Using Pseudo-Covariates ⋮ High-dimensional local polynomial regression with variable selection and dimension reduction ⋮ Pairwise directions estimation for multivariate response regression data ⋮ Simultaneous estimation for semi-parametric multi-index models ⋮ A selective review of sufficient dimension reduction for multivariate response regression ⋮ Sufficient dimension folding via tensor inverse regression ⋮ Tail inverse regression: dimension reduction for prediction of extremes ⋮ Level Set Learning with Pseudoreversible Neural Networks for Nonlinear Dimension Reduction in Function Approximation ⋮ On forward sufficient dimension reduction for categorical and ordinal responses ⋮ Projection expectile regression for sufficient dimension reduction ⋮ On a nonlinear extension of the principal fitted component model ⋮ Sufficient variable screening with high-dimensional controls ⋮ A multi-index model for quantile regression with ordinal data ⋮ Dealing with big data: comparing dimension reduction and shrinkage regression methods ⋮ A link-free sparse group variable selection method for single-index model ⋮ Statistical learning on emerging economies ⋮ Sufficient dimension reduction via distance covariance with multivariate responses ⋮ Sliced average variance estimation for multivariate time series ⋮ Data-driven slicing for dimension reduction in regressions: A likelihood-ratio approach ⋮ The dual central subspaces in dimension reduction ⋮ Covariate Information Matrix for Sufficient Dimension Reduction ⋮ Gradient-Based Dimension Reduction of Multivariate Vector-Valued Functions ⋮ On double-index dimension reduction for partially functional data ⋮ Overlapping sliced inverse regression for dimension reduction ⋮ Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension ⋮ On sufficient dimension reduction via principal asymmetric least squares ⋮ Entropy-based sliced inverse regression ⋮ Dimension reduction with missing response at random ⋮ Sufficient dimension reduction via principal L\(q\) support vector machine ⋮ Dimension reduction based on conditional multiple index density function ⋮ Testing predictor contributions in sufficient dimension reduction. ⋮ Nonlinear surface regression with dimension reduction method ⋮ Asymptotic distribution of test statistic for the covariance dimension reduction methods in regression ⋮ MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection ⋮ Probability-enhanced effective dimension reduction for classifying sparse functional data ⋮ On expectile-assisted inverse regression estimation for sufficient dimension reduction ⋮ On permutation tests for predictor contribution in sufficient dimension reduction ⋮ Two cross-validation criteria for SIR\({}_\alpha\) and PSIR\({}_\alpha\) methods in view of prediction ⋮ An outlier robust unit root test with an application to the extended Nelson-Plosser data ⋮ Improving estimated sufficient summary plots in dimension reduction using minimization criteria based on initial estimates ⋮ Group-wise sufficient dimension reduction with principal fitted components ⋮ On a new class of sufficient dimension reduction estimators ⋮ A unified approach to sufficient dimension reduction ⋮ Nonlinear confounding in high-dimensional regression ⋮ Dimension reduction estimation for probability density with data missing at random when covariables are present ⋮ Principal quantile regression for sufficient dimension reduction with heteroscedasticity ⋮ Student sliced inverse regression ⋮ Sufficient dimension reduction using Hilbert-Schmidt independence criterion ⋮ Partial projective resampling method for dimension reduction: with applications to partially linear models ⋮ Robust estimation and variable selection in sufficient dimension reduction ⋮ Canonical kernel dimension reduction ⋮ Ensemble sufficient dimension folding methods for analyzing matrix-valued data ⋮ Principal minimax support vector machine for sufficient dimension reduction with contaminated data ⋮ On dual model-free variable selection with two groups of variables ⋮ Fused mean-variance filter for feature screening ⋮ Supervised dimension reduction for ordinal predictors ⋮ A note on least squares sensitivity in single-index model estimation and the benefits of response transformations ⋮ A bootstrap method for assessing the dimension of a general regression problem ⋮ Fused sliced average variance estimation ⋮ Sufficient dimension reduction in regressions through cumulative Hessian directions ⋮ Direction estimation in single-index models via distance covariance ⋮ Series expansion for functional sufficient dimension reduction ⋮
Penalized least squares for single index models ⋮ Functional sufficient dimension reduction: convergence rates and multiple functional case ⋮ Estimating variable structure and dependence in multitask learning via gradients ⋮ Dimension reduction based linear surrogate variable approach for model free variable selection ⋮ Estimating a sparse reduction for general regression in high dimensions ⋮ A note on the structure of the quadratic subspace in discriminant analysis
Uses Software
Cites Work
This page was built for publication: Comment