The importance of convexity in learning with squared loss
DOI: 10.1109/18.705577
zbMath: 0935.68091
OpenAlex: W2154451187
MaRDI QID: Q4701173
Authors: Robert C. Williamson, Wee Sun Lee, Peter L. Bartlett
Publication date: 21 November 1999
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.705577
Related Items
Tikhonov, Ivanov and Morozov regularization for support vector machine learning
Efficient algorithms for learning functions with bounded variation
Aggregation via empirical risk minimization
Regularization in kernel learning
Nonexact oracle inequalities, \(r\)-learnability, and fast rates
Sharper lower bounds on the performance of the empirical risk minimization algorithm
General oracle inequalities for model selection
General nonexact oracle inequalities for classes with a subexponential envelope
Fast learning from \(\alpha\)-mixing observations
Obtaining fast error rates in nonconvex situations
Large-Scale Machine Learning with Stochastic Gradient Descent
Learning Bounds for Kernel Regression Using Effective Data Dimensionality
Optimization Methods for Large-Scale Machine Learning
On the mathematical foundations of learning
On Martingale Extensions of Vapnik–Chervonenkis Theory with Applications to Online Learning
Theory of Classification: a Survey of Some Recent Advances
Boosting the margin: a new explanation for the effectiveness of voting methods
On the Optimality of Sample-Based Estimates of the Expectation of the Empirical Minimizer
On universal estimators in learning theory
Local Rademacher complexities