Editorial: Statistical learning methods including dimensionality reduction
From MaRDI portal
Classification (MSC):
- 62-08 Computational methods for problems pertaining to statistics
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 00B15 Collections of articles of miscellaneous specific interest
- 62-06 Proceedings, conferences, collections, etc. pertaining to statistics
Cites work
- scientific article (no title available); zbMATH DE number 845714
- A mixture model for the classification of three-way proximity data
- A simple and efficient method for variable ranking according to their usefulness for learning
- A unifying model involving a categorical and/or dimensional reduction for multimode data
- An even faster algorithm for ridge regression of reduced rank data
- Boosting ridge regression
- Class prediction and gene selection for DNA microarrays using regularized sliced inverse regression
- Combined use of association rules mining and clustering methods to find relevant links between binary rare attributes in a large data set
- DALASS: variable selection in discriminant analysis via the LASSO
- Dimension reduction via principal variables
- High-dimensional data clustering
- High-dimensional pseudo-logistic regression and classification with applications to gene expression data
- Improving implementation of linear discriminant analysis for the high dimension/small sample size problem
- Input selection and shrinkage in multiresponse linear regression
- Kernel logistic PLS: a tool for supervised nonlinear dimensionality reduction and binary classification
- Model-Based Gaussian and Non-Gaussian Clustering
- Model-based methods to identify multiple cluster structures in a data set
- Multivariable regression model building by using fractional polynomials: description of SAS, STATA and R programs
- Non-symmetric correspondence analysis with ordinal variables using orthogonal polynomials
- Parsimonious additive models
- Projected gradient approach to the numerical solution of the SCoTLASS
- Regularized linear and kernel redundancy analysis
- Relaxed Lasso
- Robust variable selection using least angle regression and elemental set sampling
- Unbiased split selection for classification trees based on the Gini index
- Unbiased variable selection for classification trees with multivariate responses
- Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
Cited in (3)
MaRDI item: Q1020825