Agnostic Learning of Monomials by Halfspaces Is Hard
DOI: 10.1137/120865094
zbMath: 1261.68063
arXiv: 1012.0729
OpenAlex: W2038663599
MaRDI QID: Q4910575
Vitaly Feldman, Yi Wu, Prasad Raghavendra, Venkatesan Guruswami
Publication date: 19 March 2013
Published in: SIAM Journal on Computing; 2009 50th Annual IEEE Symposium on Foundations of Computer Science (FOCS)
Full work available at URL: https://arxiv.org/abs/1012.0729
MSC classifications: Computational learning theory (68Q32); Computational difficulty of problems (lower bounds, completeness, difficulty of approximation, etc.) (68Q17)
Related Items (17)
Tikhonov, Ivanov and Morozov regularization for support vector machine learning
Nearly Optimal Solutions for the Chow Parameters Problem and Low-Weight Approximation of Halfspaces
Regularizing conjunctive features for classification
Complexity and approximability of parameterized MAX-CSPs
A Fisher consistent multiclass loss function with variable margin on positive examples
On the hardness of learning intersections of two halfspaces
A nonlinear kernel SVM Classifier via \(L_{0/1}\) soft-margin loss with classification performance
\(\alpha\)QBoost: an iteratively weighted adiabatic trained classifier
Robust classification via MOM minimization
Kernel variable selection for multicategory support vector machines
Three candidate plurality is stablest for small correlations
Gaussian bounds for noise correlation of resilient functions
The Quest for Strong Inapproximability Results with Perfect Completeness
Unnamed Item
Provably training overparameterized neural network classifiers with non-convex constraints
Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization
Tight approximation bounds for maximum multi-coverage