Pages that link to "Item:Q4701173"
The following pages link to The importance of convexity in learning with squared loss (Q4701173):
Displaying 20 items.
- Tikhonov, Ivanov and Morozov regularization for support vector machine learning (Q285946)
- General nonexact oracle inequalities for classes with a subexponential envelope (Q447832)
- Sharper lower bounds on the performance of the empirical risk minimization algorithm (Q637070)
- Aggregation via empirical risk minimization (Q842390)
- Regularization in kernel learning (Q847647)
- Obtaining fast error rates in nonconvex situations (Q933417)
- Boosting the margin: a new explanation for the effectiveness of voting methods (Q1807156)
- Efficient algorithms for learning functions with bounded variation (Q1887165)
- General oracle inequalities for model selection (Q1951973)
- On universal estimators in learning theory (Q2342272)
- Fast learning from \(\alpha\)-mixing observations (Q2443266)
- Local Rademacher complexities (Q2583411)
- On the mathematical foundations of learning (Q2761194)
- On Martingale Extensions of Vapnik–Chervonenkis Theory with Applications to Online Learning (Q2805727)
- On the Optimality of Sample-Based Estimates of the Expectation of the Empirical Minimizer (Q3085585)
- Large-Scale Machine Learning with Stochastic Gradient Descent (Q3298463)
- Theory of Classification: a Survey of Some Recent Advances (Q3373749)
- Optimization Methods for Large-Scale Machine Learning (Q4641709)
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality (Q5706660)
- Nonexact oracle inequalities, \(r\)-learnability, and fast rates (Q6149162)