Pages that link to "Item:Q5706660"
From MaRDI portal
The following pages link to Learning Bounds for Kernel Regression Using Effective Data Dimensionality (Q5706660):
Displayed 16 items.
- Estimator selection in the Gaussian setting (Q141397)
- A partially linear framework for massive heterogeneous data (Q309709)
- Random design analysis of ridge regression (Q404306)
- Least square regression with indefinite kernels and coefficient regularization (Q617706)
- Importance sampling: intrinsic dimension and computational cost (Q1750255)
- Analysis of regularized least squares for functional linear regression model (Q1791683)
- General regularization schemes for signal detection in inverse problems (Q2261922)
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces (Q2300763)
- Discrepancy based model selection in statistical inverse problems (Q2442862)
- Kernel regression, minimax rates and effective dimensionality: Beyond the regular case (Q3298576)
- Faster Kernel Ridge Regression Using Sketching and Preconditioning (Q4588937)
- (Q4637006)
- Optimal Rates for Multi-pass Stochastic Gradient Methods (Q4637012)
- Adaptive discretization for signal detection in statistical inverse problems (Q4982027)
- Analysis of regularized Nyström subsampling for regression functions of low smoothness (Q5236751)
- High-dimensional regression with unknown variance (Q5965306)