Pages that link to "Item:Q5253870"
The following pages link to Regularization schemes for minimum error entropy principle (Q5253870):
Displaying 44 items.
- On the robustness of regularized pairwise learning methods based on kernels (Q325147) (← links)
- Kernel-based conditional canonical correlation analysis via modified Tikhonov regularization (Q326761) (← links)
- Constructive analysis for coefficient regularization regression algorithms (Q491841) (← links)
- Unregularized online learning algorithms with general loss functions (Q504379) (← links)
- Learning theory approach to a system identification problem involving atomic norm (Q895425) (← links)
- Optimal learning rates for kernel partial least squares (Q1645280) (← links)
- Supersmooth density estimations over \(L^p\) risk by wavelets (Q1703887) (← links)
- Analysis of approximation by linear operators on variable \(L_\rho^{p(\cdot)}\) spaces and applications in learning theory (Q1724144) (← links)
- Distributed kernel-based gradient descent algorithms (Q1745365) (← links)
- Fast rates of minimum error entropy with heavy-tailed noise (Q2168008) (← links)
- Distributed kernel gradient descent algorithm for minimum error entropy principle (Q2175022) (← links)
- Kernel gradient descent algorithm for information theoretic learning (Q2223567) (← links)
- Convergence of online mirror descent (Q2278461) (← links)
- Online pairwise learning algorithms with convex loss functions (Q2293252) (← links)
- On reproducing kernel and density problems (Q2358762) (← links)
- Online regularized learning with pairwise loss functions (Q2361154) (← links)
- Approximation on variable exponent spaces by linear integral operators (Q2406897) (← links)
- Kernel-based sparse regression with the correntropy-induced loss (Q2409039) (← links)
- Distributed learning with multi-penalty regularization (Q2415399) (← links)
- Error analysis on regularized regression based on the maximum correntropy criterion (Q2668572) (← links)
- On extension theorems and their connection to universal consistency in machine learning (Q2835986) (← links)
- Learning rates of regression with \(q\)-norm loss and threshold (Q2835987) (← links)
- Error bounds for learning the kernel (Q2835989) (← links)
- The performance of semi-supervised Laplacian regularized regression with the least square loss (Q2980112) (← links)
- (Q4637006) (← links)
- Learning Theory of Randomized Sparse Kaczmarz Method (Q4686926) (← links)
- (Q4969211) (← links)
- Learning theory of minimum error entropy under weak moment conditions (Q5037873) (← links)
- On the K-functional in learning theory (Q5107666) (← links)
- Online regularized pairwise learning with least squares loss (Q5220066) (← links)
- Convergence analysis of distributed multi-penalty regularized pairwise learning (Q5220068) (← links)
- Semi-supervised learning with summary statistics (Q5236748) (← links)
- Thresholded spectral algorithms for sparse approximations (Q5267950) (← links)
- Learning theory of distributed spectral algorithms (Q5348011) (← links)
- On meshfree numerical differentiation (Q5375971) (← links)
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems (Q5375972) (← links)
- Online Pairwise Learning Algorithms (Q5380417) (← links)
- Analysis of Online Composite Mirror Descent Algorithm (Q5380674) (← links)
- Online minimum error entropy algorithm with unbounded sampling (Q5382494) (← links)
- Optimal learning with Gaussians and correntropy loss (Q5856264) (← links)
- Block coordinate type methods for optimization and learning (Q5889894) (← links)
- Necessary and sufficient optimality conditions for some robust variational problems (Q6054496) (← links)
- Rates of approximation by ReLU shallow neural networks (Q6062171) (← links)
- Error analysis of kernel regularized pairwise learning with a strongly convex loss (Q6112862) (← links)