Generalization errors of Laplacian regularized least squares regression
DOI: 10.1007/s11425-012-4438-3 · zbMATH Open: 1258.62049 · OpenAlex: W2254867724 · MaRDI QID: Q1933952 · FDO: Q1933952
Authors: Ying Cao, Di-Rong Chen
Publication date: 28 January 2013
Published in: Science China. Mathematics
Full work available at URL: https://doi.org/10.1007/s11425-012-4438-3
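For orientation, a minimal sketch of the estimator this record concerns, assuming the standard Laplacian regularized least squares (LapRLS) formulation from the manifold regularization framework of Belkin, Niyogi and Sindhwani (the paper's exact normalization of the Laplacian term may differ): given l labeled examples (x_1, y_1), ..., (x_l, y_l) and u unlabeled inputs x_{l+1}, ..., x_{l+u}, the estimator over a reproducing kernel Hilbert space H_K is
\[
f_{\mathbf{z},\lambda,\gamma}
= \operatorname*{arg\,min}_{f \in \mathcal{H}_K}
\frac{1}{l}\sum_{i=1}^{l}\bigl(y_i - f(x_i)\bigr)^2
+ \lambda \,\|f\|_K^2
+ \frac{\gamma}{(l+u)^2}\,\mathbf{f}^{\mathsf T} L\, \mathbf{f},
\qquad
\mathbf{f} = \bigl(f(x_1),\dots,f(x_{l+u})\bigr)^{\mathsf T},
\]
where L is the graph Laplacian built from all l + u inputs. Generalization error bounds of the kind studied here quantify how close such an estimator is to the target regression function as the sample sizes grow and the regularization parameters are tuned.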
Recommendations
- The performance of semi-supervised Laplacian regularized regression with the least square loss
- Generalization performance of graph-based semi-supervised classification
- The convergence rate of semi-supervised regression with quadratic loss
- Convergence rate of semi-supervised gradient learning algorithms
- Performance analysis of the LapRSSLG algorithm in learning theory
Mathematics Subject Classification
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Applications of functional analysis in probability theory and statistics (46N30)
Cites Work
- Theory of Reproducing Kernels
- Towards a theoretical foundation for Laplacian-based manifold methods
- Learning Theory
- Consistency of spectral clustering
- On the mathematical foundations of learning
- Title not available
- Semi-supervised learning on Riemannian manifolds
- Shannon sampling. II: Connections to learning theory
- The covering number in learning theory
- Learning rates of least-square regularized regression
- Capacity of reproducing kernel spaces in learning theory
- Manifold regularization: a geometric framework for learning from labeled and unlabeled examples
- Estimating the approximation error in learning theory
- Generalization error bounds in semi-supervised classification under the cluster assumption
- Geometry on probability spaces
- On the effectiveness of Laplacian normalization for graph semi-supervised learning
- Graph-Based Semi-Supervised Learning and Spectral Kernel Design
- Consistency of regularized spectral clustering
Cited In (21)
- Error analysis for the sparse graph-based semi-supervised classification algorithm
- Convergence rate of SVM for kernel-based robust regression
- Sparse semi-supervised learning using conjugate functions
- Manifold regularization and semi-supervised learning: some theoretical analyses
- Performance analysis of the LapRSSLG algorithm in learning theory
- Semi-supervised learning for regression based on the diffusion matrix
- Convergence rate of semi-supervised gradient learning algorithms
- Causal learning via manifold regularization
- Entropy controlled Laplacian regularization for least square regression
- Community detection in complex networks: from statistical foundations to data science applications
- Generalization analysis of Fredholm kernel regularized classifiers
- Convergence rate of the semi-supervised greedy algorithm
- Weighted co-association rate-based Laplacian regularized label description for semi-supervised regression
- The performance of semi-supervised Laplacian regularized regression with the least square loss
- An error analysis of Lavrentiev regularization in learning theory
- Generalization performance of graph-based semi-supervised classification
- Semi-supervised learning with regularized Laplacian
- The convergence rate of semi-supervised regression with quadratic loss
- Error bounds of multi-graph regularized semi-supervised classification
- Generalization error bounds in semi-supervised classification under the cluster assumption
- Block-regularized repeated learning-testing for estimating generalization error