New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property

Publication:3097486

DOI: 10.1137/100810447 · zbMath: 1247.15019 · arXiv: 1009.0744 · OpenAlex: W2963262327 · MaRDI QID: Q3097486

Rachel Ward, Felix Krahmer

Publication date: 10 November 2011

Published in: SIAM Journal on Mathematical Analysis

Full work available at URL: https://arxiv.org/abs/1009.0744




Related Items

Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
On principal components regression, random projections, and column subsampling
Johnson–Lindenstrauss Embeddings with Kronecker Structure
Randomized numerical linear algebra: Foundations and algorithms
Compressed data separation via unconstrained l1-split analysis
Spectral estimation from simulations via sketching
Derandomizing restricted isometries via the Legendre symbol
Tighter Fourier Transform Lower Bounds
Structure dependent sampling in compressed sensing: theoretical guarantees for tight frames
Improved bounds for sparse recovery from subsampled random convolutions
A Novel Compressed Sensing Scheme for Photoacoustic Tomography
A unified framework for linear dimensionality reduction in L1
Compressive Sensing with Redundant Dictionaries and Structured Measurements
Fast Phase Retrieval from Local Correlation Measurements
A Survey of Compressed Sensing
The Quest for Optimal Sampling: Computationally Efficient, Structure-Exploiting Measurements for Compressed Sensing
Quantization and Compressive Sensing
Sparse Model Uncertainties in Compressed Sensing with Application to Convolutions and Sporadic Communication
Performance of Johnson–Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering
Sparser Johnson-Lindenstrauss Transforms
The Hanson-Wright inequality for random tensors
Unnamed Item
Suprema of Chaos Processes and the Restricted Isometry Property
Low rank tensor recovery via iterative hard thresholding
Generalized notions of sparsity and restricted isometry property. II: Applications
On randomized trace estimates for indefinite matrices with an application to determinants
Iterative hard thresholding for compressed data separation
Fast deflation sparse principal component analysis via subspace projections
Restricted isometries for partial random circulant matrices
\( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
Fast Metric Embedding into the Hamming Cube
Enhanced total variation minimization for stable image reconstruction
Side effects of learning from low-dimensional data embedded in a Euclidean space
A variant of the Johnson-Lindenstrauss lemma for circulant matrices
Compressed sensing of low-rank plus sparse matrices
Dimensionality-reduced subspace clustering
A strong restricted isometry property, with an application to phaseless compressed sensing
Robustness properties of dimensionality reduction with Gaussian random matrices
Compressive Sensing
Greedy-like algorithms for the cosparse analysis model
On deterministic sketching and streaming for sparse recovery and norm estimation
Simple Analyses of the Sparse Johnson-Lindenstrauss Transform
An investigation of Newton-Sketch and subsampled Newton methods
The restricted isometry property for random block diagonal matrices
Signal separation under coherent dictionaries and \(\ell_p\)-bounded noise
Classification Scheme for Binary Data with Extensions
Near-optimal encoding for sigma-delta quantization of finite frame expansions
Kernel conjugate gradient methods with random projections
Toward a unified theory of sparse dimensionality reduction in Euclidean space
On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
Real-valued embeddings and sketches for fast distance and similarity estimation
On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
Fast binary embeddings with Gaussian circulant matrices: improved bounds
Persistent homology for low-complexity models
Convergence of projected Landweber iteration for matrix rank minimization
Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO
Compressed sensing with coherent and redundant dictionaries
Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs
Dictionary-sparse recovery via thresholding-based algorithms
Fast and RIP-optimal transforms
Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery
On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms
Simple Classification using Binary Data
Minimization of the difference of Nuclear and Frobenius norms for noisy low rank matrix recovery
Optimal fast Johnson-Lindenstrauss embeddings for large data sets
Compressed dictionary learning
Dimensionality reduction for \(k\)-distance applied to persistent homology
On the Atomic Decomposition of Coorbit Spaces with Non-integrable Kernel
Sparse reconstruction with multiple Walsh matrices
Rendition: Reclaiming What a Black Box Takes Away
Fast and memory-optimal dimension reduction using Kac's walk
Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
The Restricted Isometry Property of Subsampled Fourier Matrices
Unnamed Item
GenMod: a generative modeling approach for spectral representation of PDEs with random inputs
Unnamed Item
Theory and applications of compressed sensing



