Pages that link to "Item:Q1959089"
From MaRDI portal
The following pages link to Adaptive kernel methods using the balancing principle (Q1959089):
Displaying 27 items.
- Multi-penalty regularization in learning theory (Q306697)
- Multi-parameter regularization and its numerical realization (Q537875)
- Fast cross-validation in harmonic approximation (Q778015)
- Optimal learning rates for kernel partial least squares (Q1645280)
- A linear functional strategy for regularized ranking (Q1669294)
- A meta-learning approach to the regularized learning -- case study: blood glucose prediction (Q1941597)
- The Goldenshluger-Lepski method for constrained least-squares estimators over RKHSs (Q1983602)
- On a regularization of unsupervised domain adaptation in RKHS (Q2075006)
- Concentration inequalities for cross-validation in scattered data approximation (Q2120815)
- On an unsupervised method for parameter selection for the elastic net (Q2167641)
- Manifold regularization based on Nyström type subsampling (Q2175018)
- Kernel gradient descent algorithm for information theoretic learning (Q2223567)
- Distributed learning and distribution regression of coefficient regularization (Q2223571)
- Balancing principle in supervised learning for a general regularization scheme (Q2278452)
- Multi-task learning via linear functional strategy (Q2407408)
- A Lepskiĭ-type stopping rule for the covariance estimation of multi-dimensional Lévy processes (Q2676877)
- About the Balancing Principle for Choice of the Regularization Parameter (Q3405615)
- Gradient descent for robust kernel-based regression (Q4571003)
- (Q4637006)
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions (Q4968723)
- (Q4998979)
- Error analysis of the kernel regularized regression based on refined convex losses and RKBSs (Q5022936)
- Optimal Convergence of the Discrepancy Principle for Polynomially and Exponentially Ill-Posed Operators under White Noise (Q5073867)
- A machine learning approach to optimal Tikhonov regularization I: Affine manifolds (Q5077184)
- On the K-functional in learning theory (Q5107666)
- Robust kernel-based distribution regression (Q5157866)
- A Study on Regularization for Discrete Inverse Problems with Model-Dependent Noise (Q5359494)