On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
From MaRDI portal
Publication: 4037624
DOI: 10.2307/2290640
zbMath: 0765.62003
OpenAlex: W2115456160
Wikidata: Q124816502
Scholia: Q124816502
MaRDI QID: Q4037624
Publication date: 16 May 1993
Full work available at URL: https://escholarship.org/uc/item/38m3m5pq
Keywords: eigenvectors; sliced inverse regression; principal Hessian directions; significance test; Stein lemma; exploratory projection pursuit; effective dimension reduction directions; regression surface; Hessian matrices of the regression function; largest curvatures
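The keywords summarize the paper's method: principal Hessian directions (pHd) recover effective dimension-reduction directions as eigenvectors of an average Hessian of the regression surface, which Stein's lemma makes estimable from moments when the predictors are (near-)Gaussian. The following is a minimal numpy sketch of the y-based pHd estimator, for illustration only; it is not code from the paper, and the helper name `phd_directions` is an assumption:

```python
import numpy as np

def phd_directions(X, y):
    """Sketch of the y-based principal Hessian directions estimator.

    For Gaussian-like predictors, Stein's lemma implies the average Hessian
    of the regression function is proportional to
    Sigma_x^{-1} Sigma_yxx Sigma_x^{-1}, where
    Sigma_yxx = E[(y - E y)(x - mu)(x - mu)^T], so the pHd are the
    eigenvectors of Sigma_x^{-1} Sigma_yxx, ordered by |eigenvalue|
    (largest curvature first).
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(y)
    Xc = X - X.mean(axis=0)          # center predictors
    yc = y - y.mean()                # center response
    Sigma_x = Xc.T @ Xc / n          # predictor covariance
    Sigma_yxx = (Xc * yc[:, None]).T @ Xc / n  # response-weighted covariance
    # Eigen-decompose Sigma_x^{-1} Sigma_yxx; sort by curvature magnitude.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sigma_x, Sigma_yxx))
    order = np.argsort(-np.abs(evals))
    return evals[order].real, evecs[:, order].real
```

The leading directions correspond to the largest absolute curvatures of the regression surface; the paper additionally derives a chi-squared significance test for the number of directions, which this sketch omits.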
Related Items
Dimension reduction based on conditional multiple index density function, Testing predictor contributions in sufficient dimension reduction., Nonlinear surface regression with dimension reduction method, A new sliced inverse regression method for multivariate response, A sequential test for variable selection in high dimensional complex data, Probability-enhanced effective dimension reduction for classifying sparse functional data, Model averaging assisted sufficient dimension reduction, On expectile-assisted inverse regression estimation for sufficient dimension reduction, Two cross-validation criteria for SIR\({}_\alpha\) and PSIR\({}_\alpha\) methods in view of prediction, Exploring regression structure with graphics, Improving estimated sufficient summary plots in dimension reduction using minimization criteria based on initial estimates, Dimension reduction and estimation in the secondary analysis of case-control studies, Estimation for a partial-linear single-index model, A unified approach to sufficient dimension reduction, Quantile-slicing estimation for dimension reduction in regression, Nonlinear confounding in high-dimensional regression, Principal quantile regression for sufficient dimension reduction with heteroscedasticity, Sufficient dimension reduction using Hilbert-Schmidt independence criterion, Robust estimation and variable selection in sufficient dimension reduction, Canonical kernel dimension reduction, On the conditional distributions of low-dimensional projections from high-dimensional data, Supervised dimension reduction for ordinal predictors, A note on least squares sensitivity in single-index model estimation and the benefits of response transformations, Estimation and inference on central mean subspace for multivariate response data, Sufficient dimension reduction in regressions through cumulative Hessian directions, Direction estimation in single-index models via distance covariance, General directional regression, \(T_3\)-plot for 
testing spherical symmetry for high-dimensional data with a small sample size, Coordinate-independent sparse sufficient dimension reduction and variable selection, Dimension reduction based linear surrogate variable approach for model free variable selection, Estimation of inverse mean: an orthogonal series approach, On model-free conditional coordinate tests for regressions, On determining the structural dimension via directional regression, A dimension reduction based approach for estimation and variable selection in partially linear single-index models with high-dimensional covariates, Learning sparse gradients for variable selection and dimension reduction, A note on sensitivity of principal component subspaces and the efficient detection of influential observations in high dimensions, Dimension reduction for regression estimation with nearest neighbor method, Sparse supervised dimension reduction in high dimensional classification, Iterative application of dimension reduction methods, A general theory for nonlinear sufficient dimension reduction: formulation and estimation, On a dimension reduction regression with covariate adjustment, An adaptive estimation of MAVE, Principal support vector machines for linear and nonlinear sufficient dimension reduction, Sufficient dimension reduction based on an ensemble of minimum average variance estimators, Dimension reduction for the conditional \(k\)th moment via central solution space, A note on statistical inference for differences of covariances, Unified predictor hypothesis tests in sufficient dimension reduction: a bootstrap approach, On principal Hessian directions for multivariate response regressions, Dimension reduction for the conditional mean in regressions with categorical predictors, Using DAGs to identify the sufficient dimension reduction in the principal fitted components model, Nearest neighbor inverse regression, Dimension reduction based on constrained canonical correlation and variable filtering, 
Corrections to test statistics in principal Hessian directions, Principal weighted logistic regression for sufficient dimension reduction in binary classification, Conditionally specified models and dimension reduction in the exponential families, Successive direction extraction for estimating the central subspace in a multiple-index regression, Tail dimension reduction for extreme quantile estimation, The effect of data contamination in sliced inverse regression and finite sample breakdown point, Efficient dimension reduction for multivariate response data, Sufficient dimension reduction and variable selection for regression mean function with two types of predictors, Sliced inverse regression in reference curves estimation, Cluster-based estimation for sufficient dimension reduction, Slice inverse regression with score functions, Dimension reduction summaries for balanced contrasts, Optimal estimation of slope vector in high-dimensional linear transformation models, Dimensionality reduction by feature clustering for regression problems, Exploiting predictor domain information in sufficient dimension reduction, A sparse eigen-decomposition estimation in semiparametric regression, Dimension reduction using the generalized gradient direction, Dimension reduction estimation for central mean subspace with missing multivariate response, Kernel sliced inverse regression: regularization and consistency, Connecting continuum regression with sufficient dimension reduction, Supervised principal component analysis: visualization, classification and regression on subspaces and submanifolds, Sliced inverse moment regression using weighted chi-squared tests for dimension reduction, Inverse regression-based uncertainty quantification algorithms for high-dimensional models: theory and practice, Iterative sliced inverse regression for segmentation of ultrasound and MR images, Asymptotics for sliced average variance estimation, Minimum average variance estimation with group Lasso 
for the multivariate response central mean subspace, A constructive approach to the estimation of dimension reduction directions, Generalized principal Hessian directions for mixture multivariate skew elliptical distributions, Quasi-likelihood estimation of the single index conditional variance model, Estimating multi-index models with response-conditional least squares, A new approach on recursive and non-recursive SIR methods, Dimension reduction for nonelliptically distributed predictors, An RKHS formulation of the inverse regression dimension-reduction problem, Sliced inverse regression for multivariate response regression, A nonlinear multi-dimensional variable selection method for high dimensional data: sparse MAVE, Asymptotic and bootstrap tests for subspace dimension, On the usage of joint diagonalization in multivariate statistics, Empirical likelihood of quantile difference with missing response when high-dimensional covariates are present, Kernel discriminant analysis for regression problems, Fourier transform sparse inverse regression estimators for sufficient variable selection, An integral transform method for estimating the central mean and central subspaces, Dimension reduction based on weighted variance estimate, Contour projected dimension reduction, Dimension-reduction type test for linearity of a stochastic regression model, Dimension reduction for conditional mean in regression, Sufficient dimension reduction in regressions with categorical predictors, A study of sensitivity analysis on the method of principal Hessian directions, Asymptotic distributions for testing dimensionality in \(q\)-based pHd., A selective overview of sparse sufficient dimension reduction, Online sparse sliced inverse regression for high-dimensional streaming data, A structured covariance ensemble for sufficient dimension reduction, A note on marginal coordinate test in sufficient dimension reduction, A quantile‐slicing approach for sufficient dimension reduction with 
censored responses, Dimension reduction with expectation of conditional difference measure, An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space, Generalized Liquid Association Analysis for Multimodal Data Integration, Inference for the Dimension of a Regression Relationship Using Pseudo-Covariates, High-dimensional local polynomial regression with variable selection and dimension reduction, A selective review of sufficient dimension reduction for multivariate response regression, Adaptive-to-Model Hybrid of Tests for Regressions, Distributed Estimation for Principal Component Analysis: An Enlarged Eigenspace Analysis, Subspace Estimation with Automatic Dimension and Variable Selection in Sufficient Dimension Reduction, Level Set Learning with Pseudoreversible Neural Networks for Nonlinear Dimension Reduction in Function Approximation, Nonlinear interaction detection through partial dimension reduction with missing response data, Sufficient dimension reduction for populations with structured heterogeneity, Sufficient dimension reduction via distance covariance with multivariate responses, Sliced Inverse Regression with Regularizations, The dual central subspaces in dimension reduction, A new estimator for efficient dimension reduction in regression, Gradient-Based Dimension Reduction of Multivariate Vector-Valued Functions, Using adaptively weighted large margin classifiers for robust sufficient dimension reduction, Overlapping sliced inverse regression for dimension reduction, A novel regularization method for estimation and variable selection in multi-index models, Computational Outlier Detection Methods in Sliced Inverse Regression, A note on outlier sensitivity of Sliced Inverse Regression, Testing the Linear Mean and Constant Variance Conditions in Sufficient Dimension Reduction, POOLED SLICING METHODS VERSUS SLICING METHODS, A Review on Sliced Inverse Regression, Sufficient Dimension 
Reduction, and Applications, Sliced Inverse Regression in Metric Spaces, Sparse Sliced Inverse Regression via Cholesky Matrix Penalization, Data-guided Treatment Recommendation with Feature Scores, Flexible regression modeling, Stein's lemma for truncated generalized skew-elliptical random vectors, An Adaptive Estimation of Dimension Reduction Space, High-dimensional sufficient dimension reduction through principal projections, Conditional variance estimator for sufficient dimension reduction, Penalized Weighted Variance Estimate for Dimension Reduction, Assessing Corrections to the Weighted Chi-Squared Test for Dimension, Approximate message passing with spectral initialization for generalized linear models, On dimension folding of matrix- or array-valued statistical objects, Save: a method for dimension reduction and graphics in regression, Sparse sufficient dimension reduction with heteroscedasticity, A METHOD OF LOCAL INFLUENCE ANALYSIS IN SUFFICIENT DIMENSION REDUCTION, Determining the dimension of iterative Hessian transformation, Kernel dimension reduction in regression, On Distribution-Weighted Partial Least Squares with Diverging Number of Highly Correlated Predictors, Fourier transform approach for inverse dimension reduction method, Probability-enhanced sufficient dimension reduction for binary classification, Sufficient dimension reduction in the presence of controlling variables, On a new hybrid estimator for the central mean space, Central subspaces review: methods and applications, On relative efficiency of principal Hessian directions, Metric Learning via Cross-Validation, A robust proposal of estimation for the sufficient dimension reduction problem, Efficient and adaptive linear regression in semi-supervised settings, Estimation for single-index models via martingale difference divergence, On cross-distance selection algorithm for hybrid sufficient dimension reduction, An Inverse-regression Method of Dependent Variable 
Transformation for Dimension Reduction with Non-linear Confounding, Sufficient dimension reduction and prediction in regression: asymptotic results, Generalized kernel-based inverse regression methods for sufficient dimension reduction, Pairwise directions estimation for multivariate response regression data, Simultaneous estimation for semi-parametric multi-index models, Misspecified nonconvex statistical optimization for sparse phase retrieval, On post dimension reduction statistical inference, Estimation and variable selection in single-index composite quantile regression, Robust dimension reduction using sliced inverse median regression, Application of the Bootstrap Approach to the Choice of Dimension and the α Parameter in the SIRα Method, Quantile regression and variable selection for the single-index model, A multi-index model for quantile regression with ordinal data, On efficient dimension reduction with respect to a statistical functional of interest, Partial dynamic dimension reduction for conditional mean in regression, Minimax adaptive dimension reduction for regression, Statistical learning on emerging economies, Dimension reduction in survival regressions with censored data via an imputed spline approach, On surrogate dimension reduction for measurement error regression: An invariance law, Statistical modelling via dimension reduction methods, Penalized Interaction Estimation for Ultrahigh Dimensional Quadratic Regression, On principal graphical models with application to gene network, Feature filter for estimating central mean subspace and its sparse solution, Graph informed sliced inverse regression, Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities, Dimension Reduction in Regressions through Weighted Variance Estimation, Sequential Learning of Active Subspaces, High-dimensional regression analysis with treatment comparisons, On Sensitivity of Inverse Response Plot Estimation and the Benefits 
of a Robust Estimation Approach, Sparse dimension reduction for survival data, OLS for 1D Regression Models, Dimension reduction via marginal high moments in regression, On the generalization of Stein's lemma for elliptical class of distributions, Extending Save and PHD, The sliced inverse regression algorithm as a maximum likelihood procedure, Moment-based dimension reduction for multivariate response regression, Model-based SIR for dimension reduction, Weighted composite quantile regression for single index model with missing covariates at random, A graphical tool for selecting the number of slices and the dimension of the model in SIR and SAVE approaches, A Shrinkage Estimation of Central Subspace in Sufficient Dimension Reduction, Covariate Information Matrix for Sufficient Dimension Reduction, Dimension reduction regressions with measurement errors subject to additive distortion, A note on sliced inverse regression with missing predictors, Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments, Central quantile subspace, Tobit model estimation and sliced inverse regression, Functional sliced inverse regression analysis, Sufficient dimension reduction for conditional quantiles with alternative types of data, A note on structural adaptive dimension reduction, Adaptive-to-model checking for regressions with diverging number of predictors, Influence Functions for Dimension Reduction Methods: An Example Influence Study of Principal Hessian Direction Analysis, INFLUENCE FUNCTIONS FOR DIMENSION REDUCTION METHODS, Single-index modelling of conditional probabilities in two-way contingency tables, Fusing sufficient dimension reduction with neural networks, Gauss-Christoffel quadrature for inverse regression: applications to computer experiments, Optimal combination of linear and spectral estimators for generalized linear models, On central matrix based methods in dimension reduction, Robust sufficient dimension 
reduction via ball covariance, DETECTING INFLUENTIAL OBSERVATIONS IN SLICED INVERSE REGRESSION ANALYSIS, A note on cumulative mean estimation, Contour regression: a general approach to dimension reduction, Projective resampling estimation of informative predictor subspace for multivariate regression, A Semiparametric Approach to Dimension Reduction, Combining Linear Dimension Reduction Subspaces, Robust and resource-efficient identification of two hidden layer neural networks
Uses Software