Manifold regularization based on Nyström type subsampling
From MaRDI portal
Abstract: In this paper, we study Nyström type subsampling for large-scale kernel methods to reduce the computational complexity of big-data problems. We discuss a multi-penalty regularization scheme based on Nyström type subsampling, motivated by well-studied manifold regularization schemes. We develop a theoretical analysis of the multi-penalty least-squares regularization scheme under a general source condition in the vector-valued function setting, so the results also apply to multi-task learning problems. Using the concept of effective dimension, we achieve the minimax-optimal convergence rates of multi-penalty regularization for an appropriate subsampling size. We also discuss an aggregation approach based on the linear functional strategy to combine various Nyström approximants. Finally, we demonstrate the performance of multi-penalty regularization based on Nyström type subsampling on the Caltech-101 data set for multi-class image classification and the NSL-KDD benchmark data set for the intrusion detection problem.
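The core computational idea of the abstract — replacing the full \(n \times n\) kernel system with a small system built from \(m \ll n\) sampled landmark points — can be illustrated with a minimal single-penalty sketch. This is an illustration of plain Nyström-subsampled kernel ridge regression, not the paper's multi-penalty manifold scheme; the kernel, bandwidth, landmark count, and regularization parameter below are all illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def nystrom_krr(X, y, m, lam, sigma=1.0, seed=None):
    """Kernel ridge regression with Nyström type subsampling.

    Approximates the full kernel matrix K by K_nm K_mm^+ K_nm^T using
    m uniformly sampled landmarks, so only an (m x m) regularized
    system is solved instead of the full (n x n) one.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # uniform subsampling
    Xm = X[idx]
    K_nm = gaussian_kernel(X, Xm, sigma)        # (n, m) cross-kernel
    K_mm = gaussian_kernel(Xm, Xm, sigma)       # (m, m) landmark kernel
    # Solve (K_nm^T K_nm + n*lam*K_mm) alpha = K_nm^T y
    # (small jitter added for numerical stability).
    A = K_nm.T @ K_nm + n * lam * K_mm + 1e-10 * np.eye(m)
    alpha = np.linalg.solve(A, K_nm.T @ y)
    return lambda Xt: gaussian_kernel(Xt, Xm, sigma) @ alpha

# Usage: fit a noisy sine curve with 50 landmarks instead of 500 points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
f = nystrom_krr(X, y, m=50, lam=1e-3, seed=0)
train_mse = np.mean((f(X) - y) ** 2)
```

The paper's multi-penalty scheme would add further regularization terms (e.g. a graph-Laplacian manifold penalty) to the same subsampled system; the point of this sketch is only the cost reduction from subsampling.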
Recommendations
- Nyström type subsampling analyzed as a regularized projection
- Nyström subsampling method for coefficient-based regularized regression
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
- Analysis of regularized Nyström subsampling for regression functions of low smoothness
- Nyström-based approximate kernel subspace learning
Cites work
- Scientific article (zbMATH DE number 5957307; no title available)
- Scientific article (zbMATH DE number 3907465; no title available)
- Scientific article (zbMATH DE number 893887; no title available)
- A linear functional strategy for regularized ranking
- A unifying framework in vector-valued reproducing kernel Hilbert spaces for manifold regularization and co-regularized multi-view learning
- Adaptive kernel methods using the balancing principle
- Aggregation of regularized solutions from multiple observation models
- Convex multi-task feature learning
- Design of Non-Linear Kernel Dictionaries for Object Recognition
- Direct estimation of linear functionals from indirect noisy observations
- Discretization strategy for linear ill-posed problems in variable Hilbert scales
- Geometry of linear ill-posed problems in variable Hilbert scales
- Group-Sensitive Multiple Kernel Learning for Object Recognition
- Learning deep architectures for AI
- Learning from examples as an inverse problem
- Learning multiple tasks with kernel methods
- Learning the kernel function via regularization
- Learning theory of distributed spectral algorithms
- Manifold regularization: a geometric framework for learning from labeled and unlabeled examples
- Marketing models of consumer heterogeneity
- Multi-penalty regularization in learning theory
- Multi-task learning via linear functional strategy
- Multiple Kernel Sparse Representations for Supervised and Unsupervised Learning
- Neural Network Learning as an Inverse Problem
- Nyström type subsampling analyzed as a regularized projection
- On Learning Vector-Valued Functions
- On regularization algorithms in learning theory
- On the mathematical foundations of learning
- Optimal rates for the regularized least-squares algorithm
- Pathway-based kernel boosting for the analysis of genome-wide association studies
- Practical Bayesian support vector regression for financial time series prediction and market condition change detection
- Structured Kernel Dictionary Learning With Correlation Constraint for Object Recognition
- Theory of Reproducing Kernels
Cited in (6)
- Analysis of regularized Nyström subsampling for regression functions of low smoothness
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
- Nyström subsampling method for coefficient-based regularized regression
- Diversity sampling is an implicit regularization for kernel methods
- Nyström type subsampling analyzed as a regularized projection
- Nyström landmark sampling and regularized Christoffel functions
MaRDI item Q2175018