One-Bit Compressed Sensing by Linear Programming
From MaRDI portal
Publication: 2841676
DOI: 10.1002/cpa.21442
zbMath: 1335.94018
arXiv: 1109.4299
OpenAlex: W2964003909
MaRDI QID: Q2841676
Publication date: 26 July 2013
Published in: Communications on Pure and Applied Mathematics
Full work available at URL: https://arxiv.org/abs/1109.4299
- Linear programming (90C05)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Related Items
- Error bounds for consistent reconstruction: random polytopes and coverage processes
- Sparse recovery from inaccurate saturated measurements
- Estimation of block sparsity in compressive sensing
- Sigma delta quantization with harmonic frames and partial Fourier ensembles
- Robust decoding from 1-bit compressive sampling with ordinary and regularized least squares
- A unified framework for linear dimensionality reduction in L1
- Quantization and compressive sensing
- 1-bit compressive sensing: reformulation and RRSP-based sign recovery theory
- Phase retrieval by binary questions: which complementary subspace is closer?
- Robust recovery of low-rank matrices with non-orthogonal sparse decomposition from incomplete measurements
- Quantization of compressive samples with stable and robust recovery
- Quantization-aware phase retrieval
- Robust sensing of low-rank matrices with non-orthogonal sparse decomposition
- A unified approach to uniform signal recovery from nonlinear observations
- Robust one-bit compressed sensing with partial circulant matrices
- A simple homotopy proximal mapping algorithm for compressive sensing
- Compressive phase retrieval: optimal sample complexity with deep generative priors
- Uniform-in-submodel bounds for linear regression in a model-free framework
- Noisy 1-bit compressive sensing: models and algorithms
- Just least squares: binary compressive sampling with low generative intrinsic dimension
- Sparse probit linear mixed model
- Performance bounds of the intensity-based estimators for noisy phase retrieval
- Sparse recovery from saturated measurements
- Representation and coding of signal geometry
- High-dimensional estimation with geometric constraints
- One-bit sensing, discrepancy and Stolarsky's principle
- Flavors of compressive sensing
- One-bit compressed sensing with non-Gaussian measurements
- Quantized compressed sensing: a survey
- Classification scheme for binary data with extensions
- Real-valued embeddings and sketches for fast distance and similarity estimation
- The landscape of empirical risk for nonconvex losses
- Dimension reduction by random hyperplane tessellations
- On recovery guarantees for one-bit compressed sensing on manifolds
- The stochastic geometry of unconstrained one-bit data compression
- Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
- Estimation in high dimensions: a geometric perspective
- Fast and RIP-optimal transforms
- Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
- On the \(\ell^\infty\)-norms of the singular vectors of arbitrary powers of a difference matrix with applications to sigma-delta quantization
- Simple classification using binary data
- Thin-shell concentration for zero cells of stationary Poisson mosaics
- Memoryless scalar quantization for random frames
- Endpoint results for Fourier integral operators on noncompact symmetric spaces
- On the atomic decomposition of coorbit spaces with non-integrable kernel
- Sparse learning for large-scale and high-dimensional data: a randomized convex-concave optimization approach
- Sparse classification: a scalable discrete optimization perspective
- Characterization of \(\ell_1\) minimizer in one-bit compressed sensing
- Generalized high-dimensional trace regression via nuclear norm regularization
- On the asymptotic variance of the debiased Lasso
- High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
- Unnamed Item
- One-bit compressed sensing via \(\ell_p\) (0 < p < 1)-minimization method
- Quantized compressed sensing for random circulant matrices
- Hypothesis testing for high-dimensional sparse binary regression
- Unnamed Item
- Unnamed Item
- AdaBoost and robust one-bit compressed sensing
- Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
- Covariance estimation under one-bit quantization
- Iteratively consistent one-bit phase retrieval
Cites Work
- Unnamed Item
- Unnamed Item
- Democracy in action: quantization, saturation, and compressive sensing
- Stability and instance optimality for Gaussian measurements in compressed sensing
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Robust 1-Bit Compressive Sensing via Binary Stable Embeddings of Sparse Vectors
- Threshold Group Testing
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming
- Trust, But Verify: Fast and Accurate Signal Recovery From 1-Bit Compressive Measurements
- Dequantizing Compressed Sensing: When Oversampling and Non-Gaussian Constraints Combine
- Stable signal recovery from incomplete and inaccurate measurements