The following pages link to Regularization in kernel learning (Q847647):
Displaying 45 items.
- Integral operator approach to learning theory with unbounded sampling (Q371679) (← links)
- Learning with coefficient-based regularization and \(\ell^1\)-penalty (Q380980) (← links)
- Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs (Q431161) (← links)
- Consistency of support vector machines using additive kernels for additive models (Q433246) (← links)
- General nonexact oracle inequalities for classes with a subexponential envelope (Q447832) (← links)
- A short note on the comparison of interpolation widths, entropy numbers, and Kolmogorov widths (Q505890) (← links)
- Optimal learning rates for least squares regularized regression with unbounded sampling (Q617656) (← links)
- Estimating conditional quantiles with the help of the pinball loss (Q637098) (← links)
- Optimal rates for regularization of statistical inverse learning problems (Q667648) (← links)
- Learning rates for kernel-based expectile regression (Q669274) (← links)
- Asymptotic normality of support vector machine variants and other regularized kernel methods (Q765834) (← links)
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities (Q1930861) (← links)
- Concentration estimates for learning with unbounded sampling (Q1946480) (← links)
- Optimal regression rates for SVMs using Gaussian kernels (Q1951100) (← links)
- Coefficient-based regression with non-identical unbounded sampling (Q2016624) (← links)
- Distributed regularized least squares with flexible Gaussian kernels (Q2036424) (← links)
- On the speed of uniform convergence in Mercer's theorem (Q2091033) (← links)
- ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels (Q2209821) (← links)
- Kernel variable selection for multicategory support vector machines (Q2237819) (← links)
- Fast and strong convergence of online learning algorithms (Q2305549) (← links)
- Fast learning from \(\alpha\)-mixing observations (Q2443266) (← links)
- Least-squares regularized regression with dependent samples and \(q\)-penalty (Q2903163) (← links)
- Convergence Rates of Spectral Regularization Methods: A Comparison between Ill-Posed Inverse Problems and Statistical Kernel Learning (Q3386994) (← links)
- (Q4637006) (← links)
- (Q4969211) (← links)
- Multikernel Regression with Sparsity Constraint (Q4999353) (← links)
- Efficient kernel-based variable selection with sparsistency (Q5037806) (← links)
- (Q5148996) (← links)
- (Q5159408) (← links)
- (Q5159455) (← links)
- Learning Rates of \(l^q\) Coefficient Regularization Learning with Gaussian Kernel (Q5175497) (← links)
- Ivanov-Regularised Least-Squares Estimators over Large RKHSs and Their Interpolation Spaces (Q5214209) (← links)
- (Q5214255) (← links)
- Generalized support vector regression: Duality and tensor-kernel representation (Q5220070) (← links)
- Thresholded spectral algorithms for sparse approximations (Q5267950) (← links)
- Learning with Convex Loss and Indefinite Kernels (Q5378314) (← links)
- Learning Theory Estimates with Observations from General Stationary Stochastic Processes (Q5380606) (← links)
- Optimal learning with Gaussians and correntropy loss (Q5856264) (← links)
- Convex regularization in statistical inverse learning problems (Q6115632) (← links)
- Orthogonal statistical learning (Q6136574) (← links)
- Measuring Complexity of Learning Schemes Using Hessian-Schatten Total Variation (Q6171683) (← links)
- Structure learning via unstructured kernel-based M-estimation (Q6184881) (← links)
- Nonparametric augmented probability weighting with sparsity (Q6554241) (← links)
- Least squares approximations in linear statistical inverse learning problems (Q6601195) (← links)
- Spectral regularized Kernel two-sample tests (Q6608681) (← links)