Self-improved gaps almost everywhere for the agnostic approximation of monomials
Publication: Q884469
DOI: 10.1016/j.tcs.2007.02.023
zbMATH Open: 1117.68040
OpenAlex: W2135781605
MaRDI QID: Q884469
FDO: Q884469
Authors: Richard Nock, Frank Nielsen
Publication date: 6 June 2007
Published in: Theoretical Computer Science
Full work available at URL: https://doi.org/10.1016/j.tcs.2007.02.023
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Very simple classification rules perform well on most commonly used datasets
- Improved boosting algorithms using confidence-rated predictions
- Clique is hard to approximate within \(n^{1-\epsilon}\)
- Linear degree extractors and the inapproximability of max clique and chromatic number
- Toward efficient agnostic learning
- Robust trainability of single neurons
- On the limits of proper learnability of subclasses of DNF formulas
- Lower bounds on learning decision lists and trees
- Maximizing agreements and coagnostic learning
- Complexity in the case against accuracy estimation