Sufficient dimension reduction and prediction in regression

From MaRDI portal
Publication:3559952

DOI: 10.1098/rsta.2009.0110
zbMath: 1185.62109
OpenAlex: W2022215192
Wikidata: Q51787299
Scholia: Q51787299
MaRDI QID: Q3559952

Kofi P. Adragni, R. Dennis Cook

Publication date: 8 May 2010

Published in: Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences

Full work available at URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.517.789



Related Items

Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units
A sequential test for variable selection in high dimensional complex data
The ensemble conditional variance estimator for sufficient dimension reduction
Conditional variance estimator for sufficient dimension reduction
Group-wise sufficient dimension reduction with principal fitted components
Dimension Reduction Forests: Local Variable Importance Using Structured Random Forests
On weighted multivariate sign functions
Graph-Assisted Inverse Regression for Count Data and Its Application to Sequencing Data
Sufficient dimension reduction constrained through sub-populations
Supervised dimension reduction for ordinal predictors
Generalized Tensor Decomposition With Features on Multiple Modes
Calibrating sufficiently
Nonlinear multi-output regression on unknown input manifold
Sufficient dimension reduction and prediction in regression: asymptotic results
Inverse regression approach to robust nonlinear high-to-low dimensional mapping
Single-index importance sampling with stratification
Self-supervised Metric Learning in Multi-View Data: A Downstream Task Perspective
Mining the factor zoo: estimation of latent factor models with sufficient proxies
Level Set Learning with Pseudoreversible Neural Networks for Nonlinear Dimension Reduction in Function Approximation
Independent screening in high-dimensional exponential family predictors' space
Estimating covariance and precision matrices along subspaces
Pruning a sufficient dimension reduction with a p-value guided hard-thresholding
High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
Data-Driven Polynomial Ridge Approximation Using Variable Projection
Estimating sufficient reductions of the predictors in abundant high-dimensional regressions
Inverse regression-based uncertainty quantification algorithms for high-dimensional models: theory and practice
A simple measure of conditional dependence
Sufficient dimension reduction and prediction through cumulative slicing PFC
Dimension Reduction via Gaussian Ridge Functions
Estimating multi-index models with response-conditional least squares
Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments
Gradient-Based Dimension Reduction of Multivariate Vector-Valued Functions
A slice of multivariate dimension reduction
Lurking Variable Detection via Dimensional Analysis
Gauss-Christoffel quadrature for inverse regression: applications to computer experiments
Nonlinear Level Set Learning for Function Approximation on Sparse Data with Applications to Parametric Differential Equations
High-dimensional regression with Gaussian mixtures and partially-latent response variables

