Least square regression with indefinite kernels and coefficient regularization
From MaRDI portal
Recommendations
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Regularization networks with indefinite kernels
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- Optimal rates for coefficient-based regularized regression
Cites work
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 2115052
- 10.1162/153244302760200650
- A note on application of integral operator in learning theory
- Application of integral operator for regularized least-square regression
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Learning rates of least-square regularized regression
- Learning the kernel function via regularization
- Learning the kernel matrix with semidefinite programming
- Learning theory estimates via integral operators and their approximations
- Learning with sample dependent hypothesis spaces
- Leave-One-Out Bounds for Kernel Methods
- Multi-kernel regularized classifiers
- On regularization algorithms in learning theory
- On the mathematical foundations of learning
- Refinable kernels
- Refinement of reproducing kernels
- Regularization networks and support vector machines
- Regularized least square regression with dependent samples
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Shannon sampling and function reconstruction from point values
- Shannon sampling. II: Connections to learning theory
- Spectral Algorithms for Supervised Learning
- Support vector machine classification with indefinite kernels
- Theory of Reproducing Kernels
Cited in (51)
- Kernel-based sparse regression with the correntropy-induced loss
- Distributed learning with partial coefficients regularization
- Ivanov-regularised least-squares estimators over large RKHSs and their interpolation spaces
- A simpler approach to coefficient regularized support vector machines regression
- Indefinite kernel network with \(l^q\)-norm regularization
- Distributed robust regression with correntropy losses and regularization kernel networks
- Nyström subsampling method for coefficient-based regularized regression
- Distributed learning with indefinite kernels
- An empirical feature-based learning algorithm producing sparse approximations
- Support vector machines regression with \(l^1\)-regularizer
- Reproducing kernel Banach spaces with the \(\ell^{1}\) norm. II: Error analysis for regularized least square regression
- Learning rates for least square regressions with coefficient regularization
- Kernelized elastic net regularization: generalization bounds, and sparse recovery
- Learning from regularized regression algorithms with \(p\)-order Markov chain sampling
- Coefficient-based regression with non-identical unbounded sampling
- Statistical consistency of coefficient-based conditional quantile regression
- Iterative kernel regression with preconditioning
- Consistency analysis of spectral regularization algorithms
- Regularized modal regression with data-dependent hypothesis spaces
- Unified approach to coefficient-based regularized regression
- Distributed regression learning with coefficient regularization
- Half supervised coefficient regularization for regression learning with unbounded sampling
- Indefinite kernel network with dependent sampling
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Convergence rate of the semi-supervised greedy algorithm
- Coefficient regularized regression with non-iid sampling
- Error Analysis of Coefficient-Based Regularized Algorithm for Density-Level Detection
- On the \(K\)-functional in learning theory
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Coefficient-based regularized distribution regression
- Supervised learning with indefinite topological kernels
- Learning with convex loss and indefinite kernels
- Convergence rate of kernel canonical correlation analysis
- Regularization networks with indefinite kernels
- Conditional quantiles with varying Gaussians
- Indefinite kernels in least squares support vector machines and principal component analysis
- Learning rate of distribution regression with dependent samples
- Distributed semi-supervised regression learning with coefficient regularization
- Constructive analysis for coefficient regularization regression algorithms
- Coefficient-based regularization network with variance loss for error
- Optimal rates for coefficient-based regularized regression
- Regularized least square regression with unbounded and dependent sampling
- Regression learning with non-identically and non-independently sampling
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- Distributed learning and distribution regression of coefficient regularization
- Learning Rates of \(l^q\) Coefficient Regularization Learning with Gaussian Kernel
- The learning rate of \(l_2\)-coefficient regularized classification with strong loss
- Convergence analysis of coefficient-based regularization under moment incremental condition
- Splines with non positive kernels
- Least squares regression with \(l_1\)-regularizer in sum space
- Learning rates of multitask kernel methods
This page was built for publication: Least square regression with indefinite kernels and coefficient regularization (MaRDI item Q617706)