AdaBoost and robust one-bit compressed sensing
From MaRDI portal
Publication:2102435
Recommendations
- Robust one-bit compressed sensing with partial circulant matrices
- Random classification noise defeats all convex potential boosters
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- An approach to one-bit compressed sensing based on probably approximately correct learning theory
- Robust 1-bit compressed sensing via hinge loss minimization
Cites work
- Scientific article (zbMATH DE number 2107836; title unavailable)
- Scientific article (zbMATH DE number 7626737; title unavailable)
- Scientific article (zbMATH DE number 6781369; title unavailable)
- A decision-theoretic generalization of on-line learning and an application to boosting
- A model of double descent for high-dimensional binary linear classification
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Adaptive estimation of a quadratic functional by model selection
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Arcing classifiers. (With discussion)
- Atomic Decomposition by Basis Pursuit
- Benign overfitting in linear regression
- Boosting as a regularized path to a maximum margin classifier
- Boosting for high-dimensional linear models
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Boosting with early stopping: convergence and consistency
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Improved boosting algorithms using confidence-rated predictions
- Inequalities of Bernstein-Jackson-type and the degree of compactness of operators in Banach spaces
- Learning without concentration
- Mathematical foundations of infinite-dimensional statistical models
- Moment inequalities for sums of dependent random variables under projective conditions
- Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
- Normal Approximation by Stein’s Method
- On sparse reconstruction from Fourier and Gaussian measurements
- On the Bayes-risk consistency of regularized boosting methods
- On the minimum of several random variables
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers
- One-Bit Compressive Sensing With Norm Estimation
- One-bit compressed sensing by linear programming
- One-bit compressed sensing with non-Gaussian measurements
- One-bit compressive sensing of dictionary-sparse signals
- Population theory for boosting ensembles
- Process consistency for AdaBoost
- Reconciling modern machine-learning practice and the classical bias-variance trade-off
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Robust 1-Bit Compressive Sensing via Binary Stable Embeddings of Sparse Vectors
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- Soft margins for AdaBoost
- Stability and instance optimality for Gaussian measurements in compressed sensing
- Surprises in high-dimensional ridgeless least squares interpolation
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- The Integral of a Symmetric Unimodal Function over a Symmetric Convex Set and Some Probability Inequalities
- The implicit bias of gradient descent on separable data
- The rate of convergence of AdaBoost
Cited in 2 documents
This page was built for publication: AdaBoost and robust one-bit compressed sensing