Principal support vector machines for linear and nonlinear sufficient dimension reduction
From MaRDI portal
Keywords: principal components; reproducing kernel Hilbert space; inverse regression; contour regression; invariant kernel
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Factor analysis and principal components; correspondence analysis (62H25)
- Learning and adaptive systems in artificial intelligence (68T05)
- Estimation in multivariate analysis (62H12)
- Applications of functional analysis in probability theory and statistics (46N30)
- Graphical methods in statistics (62A09)
Abstract: We introduce a principal support vector machine (PSVM) approach that can be used for both linear and nonlinear sufficient dimension reduction. The basic idea is to divide the response variables into slices and use a modified form of support vector machine to find the optimal hyperplanes that separate them. These optimal hyperplanes are then aligned by the principal components of their normal vectors. It is proved that the aligned normal vectors provide an unbiased, \(\sqrt{n}\)-consistent, and asymptotically normal estimator of the sufficient dimension reduction space. The method is then generalized to nonlinear sufficient dimension reduction using the reproducing kernel Hilbert space. In that context, the aligned normal vectors become functions and it is proved that they are unbiased in the sense that they are functions of the true nonlinear sufficient predictors. We compare PSVM with other sufficient dimension reduction methods by simulation and in real data analysis, and through both comparisons firmly establish its practical advantages.
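The linear procedure sketched in the abstract (slice the response, fit one soft-margin linear SVM per slice, then take principal components of the stacked normal vectors) can be illustrated with a minimal numpy implementation. This is only a hedged sketch, not the authors' exact estimator: the SVM is fit by plain subgradient descent, the slicing cut-points are sample quantiles, and the function name `psvm_directions` and all tuning parameters are illustrative choices.

```python
import numpy as np

def psvm_directions(X, y, n_slices=5, d=1, lam=1e-2, lr=0.01, n_iter=500):
    """Rough sketch of linear principal support vector machines (PSVM).

    For each quantile cut of the response, fit a soft-margin linear SVM
    (via subgradient descent) separating observations above the cut from
    those below, then return the leading principal components of the
    stacked normal vectors as the estimated dimension-reduction basis.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)  # center the predictors
    # interior quantile cut-points used to dichotomize the response
    cuts = np.quantile(y, np.linspace(0, 1, n_slices + 1)[1:-1])
    normals = []
    for c in cuts:
        t = np.where(y > c, 1.0, -1.0)  # labels: above vs. below the cut
        w = np.zeros(p)
        for _ in range(n_iter):
            margin = t * (Xc @ w)
            mask = margin < 1  # points violating the margin
            if mask.any():
                grad = lam * w - (t[mask, None] * Xc[mask]).mean(axis=0)
            else:
                grad = lam * w
            w -= lr * grad
        normals.append(w)
    M = np.array(normals)  # one normal vector per slice
    # principal components of the normals span the estimated SDR space
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:d].T  # p x d basis matrix
```

On a toy single-index model `y = X @ beta + noise`, the leading estimated direction should be nearly collinear with `beta`; the kernelized (RKHS) extension described in the abstract replaces the linear normals with functions and is not covered by this sketch.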
Recommendations
- Sufficient dimension reduction via principal L\(q\) support vector machine
- Principal weighted support vector machines for sufficient dimension reduction in binary classification
- A cost based reweighted scheme of principal support vector machine
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- High-dimensional sufficient dimension reduction through principal projections
Cites work
- scientific article; zbMATH DE number 5957198
- scientific article; zbMATH DE number 5968880
- scientific article; zbMATH DE number 3676608
- scientific article; zbMATH DE number 47995
- scientific article; zbMATH DE number 1220060
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 490141
- scientific article; zbMATH DE number 1779487
- scientific article; zbMATH DE number 1850469
- A characterization of spherical distributions
- A nonlinear multi-dimensional variable selection method for high dimensional data: sparse MAVE
- Algorithmic Learning Theory
- An Adaptive Estimation of Dimension Reduction Space
- An RKHS formulation of the inverse regression dimension-reduction problem
- Asymptotic Statistics
- Asymptotics for sliced average variance estimation
- Comment
- Contour regression: a general approach to dimension reduction
- Convex functional analysis
- Dimension Reduction for the Conditional \(k\)th Moment in Regression
- Dimension reduction for conditional mean in regression
- Dimension reduction for nonelliptically distributed predictors
- Estimating the dimension of a model
- Fisher lecture: Dimension reduction in regression
- Graphics for Regressions With a Binary Response
- Kernel dimension reduction in regression
- Nonparametric estimating equations based on a penalized information criterion
- On Directional Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On Quasi Likelihood Equations with Non-parametric Weights
- On Sliced Inverse Regression With High-Dimensional Covariates
- On almost linearity of low dimensional projections from high dimensional data
- On the distribution of the left singular vectors of a random matrix and its applications
- Principal fitted components for dimension reduction in regression
- Regression analysis under link violation
- Sliced Inverse Regression for Dimension Reduction
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- Sufficient Dimension Reduction via Inverse Regression
- The commutation matrix: Some properties and applications
- Theory of Reproducing Kernels
Cited in (46)
- On an additive semigraphoid model for statistical networks with application to pathway analysis
- On a nonlinear extension of the principal fitted component model
- Nonlinear predictive directions in clinical trials
- Sufficient dimension reduction via principal L\(q\) support vector machine
- Predictive power of principal components for single-index model and sufficient dimension reduction
- On sufficient dimension reduction for functional data: inverse moment-based methods
- Principal weighted logistic regression for sufficient dimension reduction in binary classification
- Dimension reduction in binary response regression: a joint modeling approach
- On expectile-assisted inverse regression estimation for sufficient dimension reduction
- A study on imbalance support vector machine algorithms for sufficient dimension reduction
- Dimension reduction for functional data based on weak conditional moments
- Quantile-slicing estimation for dimension reduction in regression
- Statistical learning on emerging economies
- Using adaptively weighted large margin classifiers for robust sufficient dimension reduction
- Using sliced inverse mean difference for sufficient dimension reduction
- Sliced Inverse Regression in Metric Spaces
- Penalized principal logistic regression for sparse sufficient dimension reduction
- Graph informed sliced inverse regression
- A brief review of linear sufficient dimension reduction through optimization
- Using DAGs to identify the sufficient dimension reduction in the principal fitted components model
- Deep nonlinear sufficient dimension reduction
- High-dimensional sufficient dimension reduction through principal projections
- The maximum separation subspace in sufficient dimension reduction with categorical response
- On the conditional distributions of low-dimensional projections from high-dimensional data
- Principal weighted support vector machines for sufficient dimension reduction in binary classification
- A robust proposal of estimation for the sufficient dimension reduction problem
- A comparative study of the use of large margin classifiers on seismic data
- Gradient-based kernel dimension reduction for regression
- Principal quantile regression for sufficient dimension reduction with heteroscedasticity
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- Sliced inverse median difference regression
- Dimension reduction techniques for conditional expectiles
- On post dimension reduction statistical inference
- Nonlinear dimension reduction for conditional quantiles
- Inverse regression-based uncertainty quantification algorithms for high-dimensional models: theory and practice
- On sufficient dimension reduction via principal asymmetric least squares
- On the predictive potential of kernel principal components
- A cost based reweighted scheme of principal support vector machine
- Dimension reduction-based adaptive-to-model semi-supervised classification
- Gradient-based approach to sufficient dimension reduction with functional or longitudinal covariates
- On efficient dimension reduction with respect to a statistical functional of interest
- Nonlinear and additive principal component analysis for functional data
- Principal minimax support vector machine for sufficient dimension reduction with contaminated data
- An ensemble of inverse moment estimators for sufficient dimension reduction
- A new reproducing kernel‐based nonlinear dimension reduction method for survival data
- An RKHS-based semiparametric approach to nonlinear dimension reduction