AdaBoost and robust one-bit compressed sensing
Publication: Q2102435
DOI: 10.4171/MSL/31
MaRDI QID: Q2102435
Authors: Geoffrey Chinot, Felix Kuchelmeister, Matthias Löffler, Sara van de Geer
Publication date: 28 November 2022
Published in: Mathematical Statistics and Learning
Full work available at URL: https://arxiv.org/abs/2105.02083
Classification (MSC):
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 94A12 Signal theory (characterization, reconstruction, filtering, etc.)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Title not available
- Atomic Decomposition by Basis Pursuit
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Boosting for high-dimensional linear models
- Mathematical foundations of infinite-dimensional statistical models
- Boosting with early stopping: convergence and consistency
- Boosting as a regularized path to a maximum margin classifier
- Arcing classifiers. (With discussion)
- Adaptive estimation of a quadratic functional by model selection
- Improved boosting algorithms using confidence-rated predictions
- The Integral of a Symmetric Unimodal Function over a Symmetric Convex Set and Some Probability Inequalities
- Normal Approximation by Stein’s Method
- Soft margins for AdaBoost
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Moment inequalities for sums of dependent random variables under projective conditions
- On sparse reconstruction from Fourier and Gaussian measurements
- Stability and instance optimality for Gaussian measurements in compressed sensing
- Learning without concentration
- One-bit compressed sensing by linear programming
- Robust 1-Bit Compressive Sensing via Binary Stable Embeddings of Sparse Vectors
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- One-bit compressed sensing with non-Gaussian measurements
- Inequalities of Bernstein-Jackson-type and the degree of compactness of operators in Banach spaces
- On the Bayes-risk consistency of regularized boosting methods
- On the minimum of several random variables
- Population theory for boosting ensembles
- Process consistency for AdaBoost
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- One-bit compressive sensing of dictionary-sparse signals
- Reconciling modern machine-learning practice and the classical bias-variance trade-off
- One-Bit Compressive Sensing With Norm Estimation
- Title not available
- Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
- The rate of convergence of AdaBoost
- Benign overfitting in linear regression
- The implicit bias of gradient descent on separable data
- Surprises in high-dimensional ridgeless least squares interpolation
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers
- A precise high-dimensional asymptotic theory for boosting and minimum-ℓ1-norm interpolated classifiers
- Title not available
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- A model of double descent for high-dimensional binary linear classification
Cited In (2)