How to solve classification and regression problems on high-dimensional data with a supervised extension of slow feature analysis
zbMATH Open: 1318.62209 · MaRDI QID: Q2933969
Authors: Alberto N. Escalante-B., Laurenz Wiskott
Publication date: 8 December 2014
Full work available at URL: http://jmlr.csail.mit.edu/papers/v14/escalante13a.html
Recommendations
- Improved graph-based SFA: information preservation complements the slowness principle
- Theoretical analysis of the optimal free responses of graph-based SFA for the design of training graphs
- Supervised principal component analysis: visualization, classification and regression on subspaces and submanifolds
- A novel supervised dimensionality reduction algorithm: graph-based Fisher analysis
- On the relation of slow feature analysis and Laplacian eigenmaps
Keywords: supervised learning; classification; high-dimensional data; regression; nonlinear dimensionality reduction; pattern recognition; image analysis; feature analysis; feature extraction; implicitly supervised; training graphs
MSC classifications: Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Learning and adaptive systems in artificial intelligence (68T05) · Pattern recognition, speech recognition (68T10)
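As background for the slowness principle underlying the indexed paper and the graph-based SFA works listed above, here is a minimal sketch of plain linear slow feature analysis. This is an illustration only, not the paper's supervised graph-based extension; the function name and test signals are invented for the example.

```python
import numpy as np


def linear_sfa(x, n_features=1):
    """Extract the slowest linear features from a time series.

    x: array of shape (time, dims). Returns the projections onto the
    directions whose temporal derivative has the smallest variance.
    """
    # Center and whiten the data so all directions have unit variance.
    x = x - x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    keep = eigvals > 1e-10
    whitener = eigvecs[:, keep] / np.sqrt(eigvals[keep])
    z = x @ whitener
    # Slowness: minimize the variance of the finite-difference signal.
    # The slowest directions are the eigenvectors of the covariance of
    # the temporal differences with the smallest eigenvalues.
    dz = np.diff(z, axis=0)
    dvals, dvecs = np.linalg.eigh(np.cov(dz, rowvar=False))  # ascending
    return z @ dvecs[:, :n_features]


# Example: recover a slow sine mixed with a much faster one.
t = np.linspace(0, 2 * np.pi, 500)
slow, fast = np.sin(t), np.sin(11 * t)
x = np.stack([slow + fast, slow - fast], axis=1)
y = linear_sfa(x, n_features=1)[:, 0]
corr = abs(np.corrcoef(y, slow)[0, 1])
```

On this toy mixture the slowest extracted feature correlates almost perfectly with the slow source, up to sign, which is the behavior the supervised extension exploits by encoding label similarity as "temporal" neighborhood in a training graph.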
Cited In (4)
- Theoretical analysis of the optimal free responses of graph-based SFA for the design of training graphs
- Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis
- Improved graph-based SFA: information preservation complements the slowness principle
- Graph-based predictable feature analysis