Principal support vector machines for linear and nonlinear sufficient dimension reduction
From MaRDI portal
Keywords: principal components; reproducing kernel Hilbert space; inverse regression; contour regression; invariant kernel
MSC classifications:
- Nonparametric regression and quantile regression (62G08)
- Factor analysis and principal components; correspondence analysis (62H25)
- Learning and adaptive systems in artificial intelligence (68T05)
- Estimation in multivariate analysis (62H12)
- Applications of functional analysis in probability theory and statistics (46N30)
- Graphical methods in statistics (62A09)
Abstract: We introduce a principal support vector machine (PSVM) approach that can be used for both linear and nonlinear sufficient dimension reduction. The basic idea is to divide the response variable into slices and use a modified form of support vector machine to find the optimal hyperplanes that separate them. These optimal hyperplanes are then aligned by the principal components of their normal vectors. It is proved that the aligned normal vectors provide an unbiased, \(\sqrt{n}\)-consistent, and asymptotically normal estimator of the sufficient dimension reduction space. The method is then generalized to nonlinear sufficient dimension reduction using the reproducing kernel Hilbert space. In that context, the aligned normal vectors become functions, and it is proved that they are unbiased in the sense that they are functions of the true nonlinear sufficient predictors. We compare PSVM with other sufficient dimension reduction methods by simulation and in real data analysis, and through both comparisons firmly establish its practical advantages.
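The linear procedure described in the abstract (slice the response, fit an SVM per slice boundary, then take principal components of the stacked normal vectors) can be sketched as follows. This is a rough illustration only, not the authors' exact estimator: it substitutes scikit-learn's standard soft-margin `LinearSVC` for the paper's modified SVM objective, and the slicing scheme and function name `psvm_directions` are assumptions for the example.

```python
import numpy as np
from sklearn.svm import LinearSVC

def psvm_directions(X, y, n_slices=5, d=1, C=1.0):
    """Sketch of linear PSVM-style sufficient dimension reduction.

    Uses a standard soft-margin linear SVM as a stand-in for the
    paper's modified SVM. Returns a p x d basis estimate for the
    dimension reduction space.
    """
    # Center and scale predictors; the method operates on standardized X.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    # Dichotomize the response at interior quantile cut points (slices).
    cuts = np.quantile(y, np.linspace(0, 1, n_slices + 1)[1:-1])
    normals = []
    for c in cuts:
        labels = (y > c).astype(int)          # binary split at this slice
        svm = LinearSVC(C=C, dual=False).fit(Xs, labels)
        normals.append(svm.coef_.ravel())     # normal vector of hyperplane
    M = np.asarray(normals)
    # Principal components (right singular vectors) of the stacked
    # normal vectors span the estimated dimension reduction space.
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:d].T
```

On a single-index model `y = f(X @ beta) + noise`, the leading returned direction should align closely with `beta`, which is the sense in which the aligned normal vectors estimate the dimension reduction space.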
Recommendations
- Sufficient dimension reduction via principal L\(q\) support vector machine
- Principal weighted support vector machines for sufficient dimension reduction in binary classification
- A cost based reweighted scheme of principal support vector machine
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- High-dimensional sufficient dimension reduction through principal projections
Cites work
- scientific article; zbMATH DE number 5957198
- scientific article; zbMATH DE number 5968880
- scientific article; zbMATH DE number 3676608
- scientific article; zbMATH DE number 47995
- scientific article; zbMATH DE number 1220060
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 490141
- scientific article; zbMATH DE number 1779487
- scientific article; zbMATH DE number 1850469
- A characterization of spherical distributions
- A nonlinear multi-dimensional variable selection method for high dimensional data: sparse MAVE
- Algorithmic Learning Theory
- An Adaptive Estimation of Dimension Reduction Space
- An RKHS formulation of the inverse regression dimension-reduction problem
- Asymptotic Statistics
- Asymptotics for sliced average variance estimation
- Comment
- Contour regression: a general approach to dimension reduction
- Convex functional analysis
- Dimension Reduction for the Conditional \(k\)th Moment in Regression
- Dimension reduction for conditional mean in regression
- Dimension reduction for nonelliptically distributed predictors
- Estimating the dimension of a model
- Fisher lecture: Dimension reduction in regression
- Graphics for Regressions With a Binary Response
- Kernel dimension reduction in regression
- Nonparametric estimating equations based on a penalized information criterion
- On Directional Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On Quasi Likelihood Equations with Non-parametric Weights
- On Sliced Inverse Regression With High-Dimensional Covariates
- On almost linearity of low dimensional projections from high dimensional data
- On the distribution of the left singular vectors of a random matrix and its applications
- Principal fitted components for dimension reduction in regression
- Regression analysis under link violation
- Sliced Inverse Regression for Dimension Reduction
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- Sufficient Dimension Reduction via Inverse Regression
- The commutation matrix: Some properties and applications
- Theory of Reproducing Kernels
Cited in (46)
- Nonlinear dimension reduction for conditional quantiles
- Predictive power of principal components for single-index model and sufficient dimension reduction
- A brief review of linear sufficient dimension reduction through optimization
- Using DAGs to identify the sufficient dimension reduction in the principal fitted components model
- Dimension reduction techniques for conditional expectiles
- Using adaptively weighted large margin classifiers for robust sufficient dimension reduction
- Dimension reduction in binary response regression: a joint modeling approach
- On expectile-assisted inverse regression estimation for sufficient dimension reduction
- Sliced inverse median difference regression
- Sufficient dimension reduction via principal L\(q\) support vector machine
- Gradient-based kernel dimension reduction for regression
- Dimension reduction for functional data based on weak conditional moments
- Principal quantile regression for sufficient dimension reduction with heteroscedasticity
- Principal weighted logistic regression for sufficient dimension reduction in binary classification
- A robust proposal of estimation for the sufficient dimension reduction problem
- Principal minimax support vector machine for sufficient dimension reduction with contaminated data
- Quantile-slicing estimation for dimension reduction in regression
- An RKHS-based semiparametric approach to nonlinear dimension reduction
- On sufficient dimension reduction for functional data: inverse moment-based methods
- The maximum separation subspace in sufficient dimension reduction with categorical response
- High-dimensional sufficient dimension reduction through principal projections
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- A new reproducing kernel‐based nonlinear dimension reduction method for survival data
- A comparative study of the use of large margin classifiers on seismic data
- Inverse regression-based uncertainty quantification algorithms for high-dimensional models: theory and practice
- Graph informed sliced inverse regression
- On post dimension reduction statistical inference
- Deep nonlinear sufficient dimension reduction
- On a nonlinear extension of the principal fitted component model
- Statistical learning on emerging economies
- On an additive semigraphoid model for statistical networks with application to pathway analysis
- A cost based reweighted scheme of principal support vector machine
- On the predictive potential of kernel principal components
- Nonlinear predictive directions in clinical trials
- Using sliced inverse mean difference for sufficient dimension reduction
- On sufficient dimension reduction via principal asymmetric least squares
- Dimension reduction-based adaptive-to-model semi-supervised classification
- Gradient-based approach to sufficient dimension reduction with functional or longitudinal covariates
- An ensemble of inverse moment estimators for sufficient dimension reduction
- Sliced Inverse Regression in Metric Spaces
- A study on imbalance support vector machine algorithms for sufficient dimension reduction
- On the conditional distributions of low-dimensional projections from high-dimensional data
- Nonlinear and additive principal component analysis for functional data
- Penalized principal logistic regression for sparse sufficient dimension reduction
- On efficient dimension reduction with respect to a statistical functional of interest
- Principal weighted support vector machines for sufficient dimension reduction in binary classification