Least square regression with indefinite kernels and coefficient regularization
DOI: 10.1016/j.acha.2010.04.001 · zbMATH Open: 1225.65015 · OpenAlex: W2045386260 · MaRDI QID: Q617706 · FDO: Q617706
Authors: Hongwei Sun, Qiang Wu
Publication date: 13 January 2011
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://doi.org/10.1016/j.acha.2010.04.001
Recommendations
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Regularization networks with indefinite kernels
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- Optimal rates for coefficient-based regularized regression
Keywords: reproducing kernel Hilbert space; convergence; numerical examples; integral operator; regression function; Gaussian kernel; capacity independent error bounds; indefinite kernel; coefficient regularization; least square regression; Mercer kernel; learning rates
MSC classification: General nonlinear regression (62J02); Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
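The coefficient-regularized least squares scheme this paper studies can be sketched as follows. This is a minimal illustration under common conventions from the coefficient-regularization literature, not the authors' implementation; the sigmoid kernel, data, and regularization parameter below are hypothetical. The point of the sketch is that the kernel matrix need not be symmetric or positive semidefinite (an "indefinite" kernel), yet the estimator is still well defined because regularization acts on the expansion coefficients rather than on an RKHS norm:

```python
import numpy as np

def coef_regularized_ls(X, y, kernel, lam):
    """Coefficient-based regularized least squares (a sketch).

    Minimizes  (1/m) * ||K @ alpha - y||^2 + lam * m * ||alpha||^2,
    where K[i, j] = kernel(X[i], X[j]).  Setting the gradient to zero
    gives the normal equations (K.T @ K + lam * m**2 * I) alpha = K.T @ y,
    whose matrix is positive definite for lam > 0 even when K is
    indefinite, so the system is always solvable.
    """
    m = len(y)
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    alpha = np.linalg.solve(K.T @ K + lam * m**2 * np.eye(m), K.T @ y)
    return K, alpha

# Hypothetical example: a sigmoid kernel, which is generally indefinite.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=20)
y = np.sin(np.pi * X) + 0.05 * rng.standard_normal(20)
kernel = lambda s, t: np.tanh(s * t + 0.5)
K, alpha = coef_regularized_ls(X, y, kernel, lam=1e-6)
f_z = K @ alpha  # fitted values of f_z(x) = sum_j alpha_j * kernel(x, x_j)
```

The learned function is \(f_z(x)=\sum_{j=1}^m \alpha_j K(x, x_j)\); since the penalty is on \(\alpha\) itself, no Mercer (positive semidefiniteness) condition on \(K\) is needed, which is the setting the paper's error analysis addresses.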
Cites Work
- Regularization networks and support vector machines
- Theory of Reproducing Kernels
- On the mathematical foundations of learning
- Title not available
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Learning the kernel matrix with semidefinite programming
- Learning the kernel function via regularization
- Shannon sampling and function reconstruction from point values
- Leave-One-Out Bounds for Kernel Methods
- Title not available
- Shannon sampling. II: Connections to learning theory
- Learning with sample dependent hypothesis spaces
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- Multi-kernel regularized classifiers
- Support vector machine classification with indefinite kernels
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- Spectral Algorithms for Supervised Learning
- On regularization algorithms in learning theory
- Application of integral operator for regularized least-square regression
- A note on application of integral operator in learning theory
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Regularized least square regression with dependent samples
- Refinable kernels
- Refinement of reproducing kernels
- DOI: 10.1162/153244302760200650 (title not available)
Cited In (51)
- Kernel-based sparse regression with the correntropy-induced loss
- A simpler approach to coefficient regularized support vector machines regression
- Indefinite kernel network with \(l^q\)-norm regularization
- Nyström subsampling method for coefficient-based regularized regression
- Distributed learning with indefinite kernels
- An empirical feature-based learning algorithm producing sparse approximations
- Reproducing kernel Banach spaces with the \(\ell^{1}\) norm. II: Error analysis for regularized least square regression
- Support vector machines regression with \(l^1\)-regularizer
- Kernelized elastic net regularization: generalization bounds, and sparse recovery
- Learning rates for least square regressions with coefficient regularization
- Learning from regularized regression algorithms with \(p\)-order Markov chain sampling
- Regularized modal regression with data-dependent hypothesis spaces
- Coefficient-based regression with non-identical unbounded sampling
- Statistical consistency of coefficient-based conditional quantile regression
- Consistency analysis of spectral regularization algorithms
- Unified approach to coefficient-based regularized regression
- Half supervised coefficient regularization for regression learning with unbounded sampling
- Indefinite kernel network with dependent sampling
- Distributed regression learning with coefficient regularization
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Error Analysis of Coefficient-Based Regularized Algorithm for Density-Level Detection
- Coefficient regularized regression with non-iid sampling
- Convergence rate of the semi-supervised greedy algorithm
- On the \(K\)-functional in learning theory
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Supervised learning with indefinite topological kernels
- Learning with convex loss and indefinite kernels
- Convergence rate of kernel canonical correlation analysis
- Regularization networks with indefinite kernels
- Conditional quantiles with varying Gaussians
- Indefinite kernels in least squares support vector machines and principal component analysis
- Learning rate of distribution regression with dependent samples
- Distributed semi-supervised regression learning with coefficient regularization
- Coefficient-based regularization network with variance loss for error
- Constructive analysis for coefficient regularization regression algorithms
- Optimal rates for coefficient-based regularized regression
- Regression learning with non-identically and non-independently sampling
- Regularized least square regression with unbounded and dependent sampling
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- Distributed learning and distribution regression of coefficient regularization
- Learning Rates of \(l^q\) Coefficient Regularization Learning with Gaussian Kernel
- Convergence analysis of coefficient-based regularization under moment incremental condition
- Splines with non positive kernels
- The learning rate of \(l_2\)-coefficient regularized classification with strong loss
- Learning rates of multitask kernel methods
- Least squares regression with \(l_1\)-regularizer in sum space
- Ivanov-regularised least-squares estimators over large RKHSs and their interpolation spaces
- Distributed learning with partial coefficients regularization
- Distributed robust regression with correntropy losses and regularization kernel networks
- Iterative kernel regression with preconditioning
- Coefficient-based regularized distribution regression