Obtaining fast error rates in nonconvex situations
Publication: 933417
DOI: 10.1016/j.jco.2007.09.001
zbMath: 1338.60112
OpenAlex: W1989007039
MaRDI QID: Q933417
Publication date: 21 July 2008
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2007.09.001
Classification (MSC):
- 62E17 Approximations to statistical distributions (nonasymptotic)
- 46N30 Applications of functional analysis in probability theory and statistics
- 60G25 Prediction theory (aspects of stochastic processes)
Related Items
- Regularization in kernel learning
- Learning without concentration for general loss functions
- On the optimality of the empirical risk minimization procedure for the convex aggregation problem
- General oracle inequalities for model selection
- On the optimality of the aggregate with exponential weights for low temperatures
- General nonexact oracle inequalities for classes with a subexponential envelope
- Error analysis of multicategory support vector machine classifiers
Cites Work
- On the uniform convexity of \(L^p\) and \(l^p\)
- Convexity of Chebyshev sets
- Optimal aggregation of classifiers in statistical learning
- Weak convergence and empirical processes. With applications to statistics
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Empirical minimization
- Local Rademacher complexities
- Theory of Classification: a Survey of Some Recent Advances
- Lower Bounds for the Empirical Minimization Algorithm
- Approximative properties of sets in normed linear spaces
- Uniform Central Limit Theorems
- Scale-sensitive dimensions, uniform convergence, and learnability
- The importance of convexity in learning with squared loss
- Neural Network Learning