Self-improved gaps almost everywhere for the agnostic approximation of monomials
Cites work
- scientific article (no title available); zbMATH DE number 3639144
- scientific article (no title available); zbMATH DE number 1256748
- scientific article (no title available); zbMATH DE number 1303029
- scientific article (no title available); zbMATH DE number 2080452
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Clique is hard to approximate within \(n^{1-\epsilon}\)
- Complexity in the case against accuracy estimation
- Improved boosting algorithms using confidence-rated predictions
- Linear degree extractors and the inapproximability of max clique and chromatic number
- Lower bounds on learning decision lists and trees
- Maximizing agreements and coagnostic learning
- On the limits of proper learnability of subclasses of DNF formulas
- Robust trainability of single neurons
- Toward efficient agnostic learning
- Very simple classification rules perform well on most commonly used datasets