Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach

From MaRDI portal

Publication:2989476

DOI: 10.1109/TIT.2012.2207945
zbMath: 1364.94153
arXiv: 1202.1212
OpenAlex: W2964322027
MaRDI QID: Q2989476

Roman V. Vershynin, Yaniv Plan

Publication date: 8 June 2017

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://arxiv.org/abs/1202.1212
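For context, the paper concerns recovering a sparse signal from one-bit (sign-only) Gaussian measurements y = sign(Ax) via convex programming. The sketch below illustrates the measurement model with a simple thresholded linear estimate, a back-projection A^T y followed by keeping the largest entries; the dimensions, sparsity level, and thresholding rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper):
# ambient dimension n, sparsity s, number of one-bit measurements m.
n, s, m = 100, 5, 2000

# Ground-truth s-sparse unit vector.
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)
x /= np.linalg.norm(x)

# One-bit compressed sensing: only the signs of Gaussian
# measurements <a_i, x> are observed.
A = rng.standard_normal((m, n))
y = np.sign(A @ x)

# Thresholded linear estimate: back-project the signs, keep the s
# largest entries in magnitude, renormalize.  Since E[y_i a_i] is a
# scaled copy of x, the average A.T @ y / m concentrates around the
# true direction.
z = A.T @ y / m
x_hat = np.zeros(n)
top = np.argsort(np.abs(z))[-s:]
x_hat[top] = z[top]
x_hat /= np.linalg.norm(x_hat)

# Correlation with the true direction; close to 1 for m >> s log n.
print(abs(x @ x_hat))
```

The paper's actual guarantee is for a convex program (roughly, maximizing ⟨y, Ax⟩ over the set ‖x‖₁ ≤ √s, ‖x‖₂ ≤ 1); the hard-thresholding step above stands in for that projection only to keep the sketch dependency-free.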




Related Items (60)

A Simple Tool for Bounding the Deviation of Random Matrices on Geometric Sets
Sigma delta quantization with harmonic frames and partial Fourier ensembles
Robust Decoding from 1-Bit Compressive Sampling with Ordinary and Regularized Least Squares
Quantization and Compressive Sensing
A one-bit, comparison-based gradient estimator
Phase retrieval by binary questions: which complementary subspace is closer?
Convergence guarantee for the sparse monotone single index model
\(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
Generalizing CoSaMP to signals from a union of low dimensional linear subspaces
Fast and Reliable Parameter Estimation from Nonlinear Observations
Sigma Delta Quantization for Images
A theory of capacity and sparse neural encoding
A unified approach to uniform signal recovery from nonlinear observations
Robust one-bit compressed sensing with partial circulant matrices
Statistical Inference for High-Dimensional Generalized Linear Models With Binary Outcomes
Noisy 1-bit compressive sensing: models and algorithms
Just least squares: binary compressive sampling with low generative intrinsic dimension
Sparse recovery from saturated measurements
Representation and coding of signal geometry
Time for dithering: fast and quantized random embeddings via the restricted isometry property
High-dimensional estimation with geometric constraints: Table 1.
One-bit sensing, discrepancy and Stolarsky's principle
Flavors of Compressive Sensing
One-bit compressed sensing with non-Gaussian measurements
Linear regression with sparsely permuted data
Structure from Randomness in Halfspace Learning with the Zero-One Loss
An Introduction to Compressed Sensing
Quantized Compressed Sensing: A Survey
Classification Scheme for Binary Data with Extensions
Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
Fast binary embeddings with Gaussian circulant matrices: improved bounds
The landscape of empirical risk for nonconvex losses
The recovery of ridge functions on the hypercube suffers from the curse of dimensionality
Estimation from nonlinear observations via convex programming with application to bilinear regression
On recovery guarantees for one-bit compressed sensing on manifolds
Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
Estimation in High Dimensions: A Geometric Perspective
Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
Double fused Lasso regularized regression with both matrix and vector valued predictors
Variable smoothing incremental aggregated gradient method for nonsmooth nonconvex regularized optimization
Simple Classification using Binary Data
Endpoint Results for Fourier Integral Operators on Noncompact Symmetric Spaces
On the Atomic Decomposition of Coorbit Spaces with Non-integrable Kernel
Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach
Sparse classification: a scalable discrete optimization perspective
Characterization of ℓ1 minimizer in one-bit compressed sensing
Generalized high-dimensional trace regression via nuclear norm regularization
Least squares estimation in the monotone single index model
Global and Simultaneous Hypothesis Testing for High-Dimensional Logistic Regression Models
Unnamed Item
One-bit compressed sensing via ℓp (0 < p < 1)-minimization method
Hypothesis testing for high-dimensional sparse binary regression
Unnamed Item
Unnamed Item
Classification of COVID19 Patients using robust logistic regression
On the Convergence Rate of Projected Gradient Descent for a Back-Projection Based Objective
AdaBoost and robust one-bit compressed sensing
Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
Covariance estimation under one-bit quantization







This page was built for publication: Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach