Early Stopping for Kernel Boosting Algorithms: A General Analysis With Localized Complexities
DOI: 10.1109/TIT.2019.2927563 · zbMATH Open: 1432.62115 · arXiv: 1707.01543 · OpenAlex: W2961238573 · MaRDI QID: Q5211468
Martin J. Wainwright, Fanny Yang, Yuting Wei
Publication date: 28 January 2020
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://arxiv.org/abs/1707.01543
MSC classifications:
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Generalized linear models (logistic models) (62J12)
- Order statistics; empirical distribution functions (62G30)
- Optimal stopping in statistics (62L15)
Cited In (7)
- Title not available
- Learning the mapping \(\mathbf{x}\mapsto \sum\limits_{i=1}^d x_i^2\): the cost of finding the needle in a haystack
- Smoothed residual stopping for statistical inverse problems via truncated SVD estimation
- 10.1162/jmlr.2003.3.4-5.863
- Fully corrective gradient boosting with squared hinge: fast learning rates and early stopping
- From inexact optimization to learning via gradient concentration
- Toward Efficient Ensemble Learning with Structure Constraints: Convergent Algorithms and Applications