Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
Publication:2989476
Abstract: This paper develops theoretical results regarding noisy 1-bit compressed sensing and sparse binomial regression. We show that a single convex program gives an accurate estimate of the signal, or coefficient vector, for both of these models. We demonstrate that an s-sparse signal in \(\mathbb{R}^n\) can be accurately estimated from \(m = O(s \log(n/s))\) single-bit measurements using a simple convex program. This remains true even if each measurement bit is flipped with probability nearly 1/2. Worst-case (adversarial) noise can also be accounted for, and uniform results that hold for all sparse inputs are derived as well. In the terminology of sparse logistic regression, we show that \(O(s \log(n/s))\) Bernoulli trials are sufficient to estimate a coefficient vector in \(\mathbb{R}^n\) which is approximately s-sparse. Moreover, the same convex program works for virtually all generalized linear models, in which the link function may be unknown. To our knowledge, these are the first results that tie together the theory of sparse logistic regression to 1-bit compressed sensing. Our results apply to general signal structures aside from sparsity; one only needs to know the size of the set K where signals reside. This size is given by the mean width of K, a computable quantity whose square serves as a robust extension of the dimension.
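The recovery phenomenon described in the abstract can be illustrated numerically. The sketch below is not the paper's convex program; as a stand-in it uses the simple back-projection estimator \(\hat{x} \propto A^T y\) followed by hard thresholding, which exploits the same fact underlying the analysis for Gaussian measurements (the expectation of \(A^T y / m\) is proportional to the signal, even after random bit flips). All problem sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (not from the paper)
n, s, m = 200, 5, 1000
flip_prob = 0.1  # each measurement bit flipped with probability 0.1

# s-sparse unit-norm signal in R^n
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)
x /= np.linalg.norm(x)

# One-bit measurements y_i = sign(<a_i, x>), then random bit flips
A = rng.standard_normal((m, n))
y = np.sign(A @ x)
flips = rng.random(m) < flip_prob
y[flips] *= -1

# Back-project and keep only the s largest entries in magnitude.
# For Gaussian A, E[A^T y / m] is proportional to x (scaled by
# a factor depending on the flip probability), so the largest
# coordinates of the back-projection locate the support.
x_hat = A.T @ y / m
x_hat[np.argsort(np.abs(x_hat))[:-s]] = 0.0
x_hat /= np.linalg.norm(x_hat)

corr = float(x_hat @ x)
print(f"correlation with true signal: {corr:.3f}")
```

With these sizes the correlation is typically close to 1, consistent with the \(m = O(s \log(n/s))\) scaling; the convex program in the paper achieves this with uniform and adversarially robust guarantees that this naive estimator does not provide.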
Cited in (61 documents)
- Characterization of \(\ell_1\) minimizer in one-bit compressed sensing
- Endpoint results for Fourier integral operators on noncompact symmetric spaces
- Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
- Covariance estimation under one-bit quantization
- Fast binary embeddings with Gaussian circulant matrices: improved bounds
- The recovery of ridge functions on the hypercube suffers from the curse of dimensionality
- One-bit compressed sensing with non-Gaussian measurements
- Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
- One-bit sensing, discrepancy and Stolarsky's principle
- Least squares estimation in the monotone single index model
- One-bit compressed sensing via \(\ell_p\) \((0<p<1)\)-minimization method
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Robust one-bit compressed sensing with partial circulant matrices
- Sparse classification: a scalable discrete optimization perspective
- Just least squares: binary compressive sampling with low generative intrinsic dimension
- scientific article; zbMATH DE number 7306893
- Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
- Quantization and compressive sensing
- Sigma delta quantization with harmonic frames and partial Fourier ensembles
- Hypothesis testing for high-dimensional sparse binary regression
- On recovery guarantees for one-bit compressed sensing on manifolds
- scientific article; zbMATH DE number 7415078
- A unified approach to uniform signal recovery from nonlinear observations
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- On the convergence rate of projected gradient descent for a back-projection based objective
- Fast and reliable parameter estimation from nonlinear observations
- A one-bit, comparison-based gradient estimator
- Statistical Inference for High-Dimensional Generalized Linear Models With Binary Outcomes
- Uniform recovery guarantees for quantized corrupted sensing using structured or generative priors
- Variable smoothing incremental aggregated gradient method for nonsmooth nonconvex regularized optimization
- An introduction to compressed sensing
- Estimation from nonlinear observations via convex programming with application to bilinear regression
- Generalized high-dimensional trace regression via nuclear norm regularization
- On the atomic decomposition of coorbit spaces with non-integrable kernel
- Phase retrieval by binary questions: which complementary subspace is closer?
- Convergence guarantee for the sparse monotone single index model
- Noisy 1-bit compressive sensing: models and algorithms
- Quantized compressed sensing: a survey
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- Generalizing CoSaMP to signals from a union of low dimensional linear subspaces
- An approach to one-bit compressed sensing based on probably approximately correct learning theory
- Robust Decoding from 1-Bit Compressive Sampling with Ordinary and Regularized Least Squares
- Representation and coding of signal geometry
- A theory of capacity and sparse neural encoding
- Structure from randomness in halfspace learning with the zero-one loss
- The landscape of empirical risk for nonconvex losses
- Time for dithering: fast and quantized random embeddings via the restricted isometry property
- Sparse recovery from saturated measurements
- Sigma Delta Quantization for Images
- Global and Simultaneous Hypothesis Testing for High-Dimensional Logistic Regression Models
- Estimation in high dimensions: a geometric perspective
- Linear regression with sparsely permuted data
- High-dimensional estimation with geometric constraints
- Flavors of compressive sensing
- Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
- Classification scheme for binary data with extensions
- Simple classification using binary data
- A simple tool for bounding the deviation of random matrices on geometric sets
- Classification of COVID19 Patients using robust logistic regression
- Sparse learning for large-scale and high-dimensional data: a randomized convex-concave optimization approach
- AdaBoost and robust one-bit compressed sensing
MaRDI item: Q2989476