Extensions of Lipschitz mappings into a Hilbert space

From MaRDI portal
Publication: 3326256

DOI: 10.1090/conm/026/737400
zbMath: 0539.46017
OpenAlex: W2979473749
Wikidata: Q57253306
Scholia: Q57253306
MaRDI QID: Q3326256

William B. Johnson, Joram Lindenstrauss

Publication date: 1984

Published in: Conference on Modern Analysis and Probability

Full work available at URL: https://doi.org/10.1090/conm/026/737400
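This paper is the original source of the Johnson–Lindenstrauss lemma: any \(n\) points in a Hilbert space can be mapped into \(\mathbb{R}^k\) with \(k = O(\varepsilon^{-2} \log n)\) while distorting all pairwise distances by at most a factor \(1 \pm \varepsilon\). As an illustrative sketch (not part of this record; assumes numpy, and the sizes n, d, k below are arbitrary demonstration choices), a scaled Gaussian random projection realizes such an embedding with high probability:

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 50, 1000, 300           # n points in R^d, projected down to R^k
X = rng.standard_normal((n, d))   # sample point set

# Gaussian random projection, scaled by 1/sqrt(k) so that
# E[||Px - Py||^2] = ||x - y||^2 for every pair of points.
P = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ P.T

def pairwise_dists(A):
    """All pairwise Euclidean distances of the rows of A."""
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

iu = np.triu_indices(n, 1)        # each unordered pair once
ratios = pairwise_dists(Y)[iu] / pairwise_dists(X)[iu]
distortion = np.abs(ratios - 1.0).max()
print(f"max pairwise distance distortion: {distortion:.3f}")
```

With these parameters the worst-case relative distortion over all pairs is typically well below 0.5, consistent with the lemma's \(k \gtrsim \varepsilon^{-2} \log n\) scaling; shrinking k increases the distortion.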



Related Items

A Simple Tool for Bounding the Deviation of Random Matrices on Geometric Sets, Johnson–Lindenstrauss Embeddings with Kronecker Structure, Random projection preserves stability with high probability, Randomized numerical linear algebra: Foundations and algorithms, Randomized QR with Column Pivoting, Lower bounds on the low-distortion embedding dimension of submanifolds of \(\mathbb{R}^n\), PLSS: A Projected Linear Systems Solver, Algebraic compressed sensing, Tighter guarantees for the compressive multi-layer perceptron, Around the log-rank conjecture, Towards Practical Large-Scale Randomized Iterative Least Squares Solvers through Uncertainty Quantification, An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space, Random Projection Ensemble Classification with High-Dimensional Time Series, M-IHS: an accelerated randomized preconditioning method avoiding costly matrix decompositions, A hybrid stochastic interpolation and compression method for kernel matrices, Proximinality and uniformly approximable sets in \(L^p\), Secure sampling with sublinear communication, Neural ODE Control for Classification, Approximation, and Transport, \(\mathrm{MIP}^* = \mathrm{RE}\): a negative resolution to Connes' embedding problem and Tsirelson's problem, Labelings vs. 
embeddings: on distributed and prioritized representations of distances, \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\), Minimum cost flow in the CONGEST model, On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary, Fast Metric Embedding into the Hamming Cube, Neural manifold analysis of brain circuit dynamics in health and disease, Correlation-based sparse inverse Cholesky factorization for fast Gaussian-process inference, Random Projection and Recovery for High Dimensional Optimization with Arbitrary Outliers, Large deviation principles induced by the Stiefel manifold, and random multidimensional projections, Statistical embedding: beyond principal components, Distributed estimation and inference for spatial autoregression model with large scale networks, Geometric bounds on the fastest mixing Markov chain, On the distance to low-rank matrices in the maximum norm, Euclidean distortion and the sparsest cut, Simple Analyses of the Sparse Johnson-Lindenstrauss Transform, Visual Categorization with Random Projection, Advances in metric embedding theory, Approximation of the Diagonal of a Laplacian’s Pseudoinverse for Complex Network Analysis, Approximate F_2-Sketching of Valuation Functions, Log-Lipschitz embeddings of homogeneous sets with sharp logarithmic exponents and slicing products of balls, Comparison of Metric Spectral Gaps, Testing the manifold hypothesis, On dominated \(\ell_1\) metrics, Speeding-Up Lattice Reduction with Random Projections (Extended Abstract), Approximate minimum enclosing balls in high dimensions using core-sets, Some applications of Ball’s extension theorem, Flip-flop spectrum-revealing QR factorization and its applications to singular value decomposition, Local embeddings of metric spaces, Random Projections for Linear Programming, On the Atomic Decomposition of Coorbit Spaces with Non-integrable Kernel, Streaming 
Low-Rank Matrix Approximation with an Application to Scientific Simulation, A Simple Proof of the Johnson–Lindenstrauss Extension Theorem, On the Impossibility of Dimension Reduction for Doubling Subsets of $\ell_{p}$, Nonadditivity of Rényi entropy and Dvoretzky’s theorem, Beta Random Projection, The legacy of Jean Bourgain in geometric functional analysis, Semidefinite Programming Based Preconditioning for More Robust Near-Separable Nonnegative Matrix Factorization, Sampling, Metric Entropy, and Dimensionality Reduction, Euclidean arrangements in Banach spaces, Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares, Fast Cluster Tendency Assessment for Big, High-Dimensional Data, An accelerated randomized Kaczmarz method via low-rank approximation, Sketching and Embedding are Equivalent for Norms, Deterministic Truncation of Linear Matroids, New Analysis on Sparse Solutions to Random Standard Quadratic Optimization Problems and Extensions, Manifold Learning and Nonlinear Homogenization, On Flattenability of Graphs, Signal Recovery and System Calibration from Multiple Compressive Poisson Measurements, On the Complexity of Closest Pair via Polar-Pair of Point-Sets, Fast Phase Retrieval from Local Correlation Measurements, Streaming Embeddings with Slack, A Survey of Compressed Sensing, Quantization and Compressive Sensing, Compressive Gaussian Mixture Estimation, Performance of Johnson–Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering, Randomized Complete Pivoting for Solving Symmetric Indefinite Linear Systems, Sparser Johnson-Lindenstrauss Transforms, On Type of Metric Spaces, Metric Embedding via Shortest Path Decompositions, A Measure Concentration Effect for Matrices of High, Higher, and Even Higher Dimension, Applications of Deviation Inequalities on Finite Metric Sets, Optimal (Euclidean) Metric Compression, 
Convexification with Bounded Gap for Randomly Projected Quadratic Optimization, On Fiber Diameters of Continuous Maps, Lossless Prioritized Embeddings, Prioritized Metric Structures and Embedding, Nonlocal Games with Noisy Maximally Entangled States are Decidable, RidgeSketch: A Fast Sketching Based Solver for Large Scale Ridge Regression, Expander graphs and their applications, Representation and coding of signal geometry, Literature survey on low rank approximation of matrices, Bounds on Dimension Reduction in the Nuclear Norm, Randomized Sampling for Basis Function Construction in Generalized Finite Element Methods, A Sparse Random Projection-Based Test for Overall Qualitative Treatment Effects, On variants of the Johnson–Lindenstrauss lemma, Sign rank versus Vapnik-Chervonenkis dimension, Randomized Projection for Rank-Revealing Matrix Factorizations and Low-Rank Approximations, Approximation Algorithms for CSPs, Structure from Randomness in Halfspace Learning with the Zero-One Loss, Cross-entropy reduction of a data matrix with constraints on the information capacity of the projector matrices and their norms, Randomized Residual-Based Error Estimators for Parametrized Equations, An Introduction to Compressed Sensing, Classification Scheme for Binary Data with Extensions, BiLipschitz embeddings of spheres into jet space Carnot groups not admitting Lipschitz extensions, Structured Random Sketching for PDE Inverse Problems, Statistical mechanics of complex neural systems and high dimensional data, Spectral calculus and Lipschitz extension for barycentric metric spaces, Learning Complexity vs Communication Complexity, Blessing of dimensionality: mathematical foundations of the statistical physics of data, A Practical Randomized CP Tensor Decomposition, Low Distortion Metric Embedding into Constant Dimension, Limitations on Quantum Dimensionality Reduction, MULTIVARIATE CALIBRATION WITH SUPPORT VECTOR REGRESSION BASED ON RANDOM PROJECTION, On b-bit min-wise hashing for 
large-scale regression and classification with sparse data, Random Projection RBF Nets for Multidimensional Density Estimation, Dimension reduction for hyperbolic space, Intrinsic Complexity and Scaling Laws: From Random Fields to Random Vectors, Dimension Reduction for Polynomials over Gaussian Space and Applications, Inapproximability for metric embeddings into $\mathbb{R}^{d}$, Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles, Randomized Dynamic Mode Decomposition, Stochastic Algorithms in Linear Algebra - beyond the Markov Chains and von Neumann - Ulam Scheme, On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms, Book Review: Metric embeddings: bilipschitz and coarse embeddings into Banach spaces, Simple Classification using Binary Data, Optimal Bounds for Johnson-Lindenstrauss Transformations, A survey on unsupervised outlier detection in high-dimensional numerical data, Boosted sparse nonlinear distance metric learning, Too Acute to Be True: The Story of Acute Sets, Almost Optimal Explicit Johnson-Lindenstrauss Families, Coresets for Fuzzy K-Means with Applications, On Closest Pair in Euclidean Metric: Monochromatic is as Hard as Bichromatic, Johnson-Lindenstrauss lemma for circulant matrices, Six mathematical gems from the history of distance geometry, Optimality of linear sketching under modular updates, Sparse Sensor Placement Optimization for Classification, IMPROVED ANALYSIS OF THE SUBSAMPLED RANDOMIZED HADAMARD TRANSFORM, Change-Point Detection of the Mean Vector with Fewer Observations than the Dimension Using Instantaneous Normal Random Projections, Information Preserving Dimensionality Reduction, Near Isometric Terminal Embeddings for Doubling Metrics, On the Complexity of Closest Pair via Polar-Pair of Point-Sets, Estimating Leverage Scores via Rank Revealing Methods and Randomization, 
Characterization of lung tumor subtypes through gene expression cluster validity assessment, Theory and applications of compressed sensing, Tensor-Structured Sketching for Constrained Least Squares, Approximate Nearest Neighbors Search Without False Negatives for \(\ell_2\) for \(c > \sqrt{\log\log n}\), Random projections and Hotelling’s \(T^2\) statistics for change detection in high-dimensional data streams, Why Are Big Data Matrices Approximately Low Rank?, A refined approximation for Euclidean \(k\)-means, A scheme for distributed compressed video sensing based on hypothesis set optimization techniques, Pass-efficient methods for compression of high-dimensional turbulent flow data, Derandomized compressed sensing with nonuniform guarantees for \(\ell_1\) recovery, Binary vectors for fast distance and similarity estimation, Random embeddings with an almost Gaussian distortion, Random-walk based approximate \(k\)-nearest neighbors algorithm for diffusion state distance, Lower bounds on lattice sieving and information set decoding, High-dimensional clustering via random projections, Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections, Spectral estimation from simulations via sketching, Efficient binary embedding of categorical data using BinSketch, Metric extension operators, vertex sparsifiers and Lipschitz extendability, The geometry of graphs and some of its algorithmic applications, Approximation and inapproximability results for maximum clique of disc graphs in high dimensions, The Johnson-Lindenstrauss lemma almost characterizes Hilbert space, but not quite, Numerical bifurcation analysis of PDEs from lattice Boltzmann model simulations: a parsimonious machine learning approach, Infinite lattice learner: an ensemble for incremental learning, Covariance matrix testing in high dimension using random projections, 
Perturbations of the \textsc{Tcur} decomposition for tensor valued data in the Tucker format, Far-field compression for fast kernel summation methods in high dimensions, The perfect marriage and much more: combining dimension reduction, distance measures and covariance, On the distortion required for embedding finite metric spaces into normed spaces, Multiscale geometric methods for data sets. I: Multiscale SVD, noise and curvature., Practical non-interactive publicly verifiable secret sharing with thousands of parties, Randomization and entropy in machine learning and data processing, Entropy-randomized projection, A survey of outlier detection in high dimensional data streams, Terminal embeddings, Near isometric terminal embeddings for doubling metrics, Optimal stable nonlinear approximation, Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument, Binary random projections with controllable sparsity patterns, Estimation of Wasserstein distances in the spiked transport model, A simple test for zero multiple correlation coefficient in high-dimensional normal data using random projection, An introduction to the Ribe program, Distance geometry and data science, Estimates on the Markov convexity of Carnot groups and quantitative nonembeddability, Absolutely minimal Lipschitz extension of tree-valued mappings, Variance estimates and almost Euclidean structure, A simple homotopy proximal mapping algorithm for compressive sensing, Essay on Kashin's remarkable 1977 decomposition theorem, Adaptive iterative Hessian sketch via \(A\)-optimal subsampling, An efficient superpostional quantum Johnson-Lindenstrauss lemma via unitary \(t\)-designs, Random projections for quadratic programs, Side effects of learning from low-dimensional data embedded in a Euclidean space, Random projections of linear and semidefinite problems with linear inequalities, It ain't where you're from, it's where you're at: hiring origins, firm 
heterogeneity, and wages, On deterministic sketching and streaming for sparse recovery and norm estimation, One-trial correction of legacy AI systems and stochastic separation theorems, Ensemble clustering using semidefinite programming with applications, A weighted multiple classifier framework based on random projection, Linear dimension reduction approximately preserving a function of the $1$-norm, On closest pair in Euclidean metric: monochromatic is as hard as bichromatic, An \(O(N \log N)\) hierarchical random compression method for kernel matrices by sampling partial matrix entries, On embedding trees into uniformly convex Banach spaces, A central limit theorem for convex sets, Learning intersections of halfspaces with a margin, Using projection-based clustering to find distance- and density-based clusters in high-dimensional data, Impossibility of almost extension, On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms, Dimension reduction by random hyperplane tessellations, Convergence rates of learning algorithms by random projection, Randomized large distortion dimension reduction, A randomized method for solving discrete ill-posed problems, An algorithmic theory of learning: Robust concepts and random projection, Markov chains in smooth Banach spaces and Gromov-hyperbolic metric spaces, GLDH: toward more efficient global low-density locality-sensitive hashing for high dimensions, Learning in compressed space, Noise-Shaping Quantization Methods for Frame-Based and Compressive Sampling Systems, Geometric component analysis and its applications to data analysis, Random projections for conic programs, Impossibility of dimension reduction in the nuclear norm, Stochastic boundary methods of fundamental solutions for solving PDEs, An average John theorem, No dimension reduction for doubling subsets of \(\ell_q\) when \(q>2\) revisited, Projected tests for high-dimensional covariance matrices, Learning the truth vector in high dimensions, 
Robust \(k\)-means clustering for distributions with two moments, Random projection-based auxiliary information can improve tree-based nearest neighbor search, Testing and estimating change-points in the covariance matrix of a high-dimensional time series, Bayesian random projection-based signal detection for Gaussian scale space random fields, Optimal fast Johnson-Lindenstrauss embeddings for large data sets, Compressed dictionary learning, Dimensionality reduction for \(k\)-distance applied to persistent homology, A space with no unconditional basis that satisfies the Johnson-Lindenstrauss lemma, On the strong restricted isometry property of Bernoulli random matrices, An asymptotic thin shell condition and large deviations for random multidimensional projections, Affine quermassintegrals of random polytopes, High-dimensional outlier detection using random projections, Dimension reduction in recurrent networks by canonicalization, Empirical processes and random projections, Metric structures in \(L_1\): dimension, snowflakes, and average distortion, Fast and memory-optimal dimension reduction using Kac's walk, Monotone maps, sphericity and bounded second eigenvalue, Manifold reconstruction and denoising from scattered data in high dimension, Variance reduction in feature hashing using MLE and control variate method, On the error bound in a combinatorial central limit theorem, On the perceptron's compression, On Lipschitz extension from finite subsets, On principal components regression, random projections, and column subsampling, Robust vertex enumeration for convex hulls in high dimensions, Matrix sketching for supervised classification with imbalanced classes, Banach spaces with a weak cotype 2 property, Remarks on the geometry of coordinate projections in \(\mathbb{R}^n\), Dimension reduction and construction of feature space for image pattern recognition, Loda: lightweight on-line detector of anomalies, Gaussian random projections for Euclidean membership 
problems, A unified framework for linear dimensionality reduction in L1, Compressed sensing and dynamic mode decomposition, An algorithmic theory of learning: robust concepts and random projection, Kernels as features: on kernels, margins, and low-dimensional mappings, A Gaussian small deviation inequality for convex functions, Metric embedding, hyperbolic space, and social networks, On Dvoretzky's theorem for subspaces of \(L_p\), On Lipschitz embedding of finite metric spaces in Hilbert space, Randomized nonlinear projections uncover high-dimensional structure, On embedding expanders into \(\ell_p\) spaces, Efficient clustering on Riemannian manifolds: a kernelised random projection approach, The Johnson-Lindenstrauss lemma and the sphericity of some graphs, Randomized projective methods for the construction of binary sparse vector representations, Isometries and additive mapping on the unit spheres of normed spaces, Dimension reduction for finite trees in \(\ell_1\), A nonlinear approach to dimension reduction, Fast density-weighted low-rank approximation spectral clustering, Randomized anisotropic transform for nonlinear dimensionality reduction, Sparsified randomization algorithms for low rank approximations and applications to integral equations and inhomogeneous random field simulation, Database-friendly random projections: Johnson-Lindenstrauss with binary coins., Compression bounds for Lipschitz maps from the Heisenberg group to \(L_{1}\), Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis, A variant of the Johnson-Lindenstrauss lemma for circulant matrices, Dense fast random projections and Lean Walsh transforms, Shape classification by manifold learning in multiple observation spaces, More on the duality conjecture for entropy numbers., Dimensional reduction in vector space methods for natural language processing: products and projections, Explicit 
constructions of RIP matrices and related problems, Robustness properties of dimensionality reduction with Gaussian random matrices, Compressed labeling on distilled labelsets for multi-label learning, A tree-based regressor that adapts to intrinsic dimension, Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma, Problems and results in extremal combinatorics. I., Approximate nearest neighbor search for \(\ell_{p}\)-spaces \((2 < p < \infty)\) via embeddings, Indexability, concentration, and VC theory, Low dimensional embeddings of ultrametrics., Entropy dimension reduction method for randomized machine learning problems, An efficient algorithm for maximal margin clustering, Large dimensional sets not containing a given angle, Analysis of agglomerative clustering, Obituary: On the mathematical contributions of Joram Lindenstrauss, Near-optimal encoding for sigma-delta quantization of finite frame expansions, Deterministic parallel algorithms for bilinear objective functions, Improved linear embeddings via Lagrange duality, Toward a unified theory of sparse dimensionality reduction in Euclidean space, Forecasting using random subspace methods, Bayesian compressed vector autoregressions, Scalable density-based clustering with quality guarantees using random projections, Graph summarization with quality guarantees, Learning the geometry of common latent variables using alternating-diffusion, Extensions of vector-valued functions with preservation of derivatives, Real-valued embeddings and sketches for fast distance and similarity estimation, Solving LP using random projections, Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all, Dimensionality reduction with subgaussian matrices: a unified theory, Human activity recognition in AAL environments using random projections, Markov chains, Riesz transforms and Lipschitz maps, Fast, linear time, \(m\)-adic hierarchical clustering for search and retrieval using the 
Baire metric, with linkages to generalized ultrametrics, hashing, formal concept analysis, and precision of data measurement, A performance driven methodology for cancelable face templates generation, Randomization of data acquisition and \(\ell_{1}\)-optimization (recognition with compression), Extending Lipschitz functions via random metric partitions, The Mailman algorithm: a note on matrix-vector multiplication, Almost isometries and orthogonality, Swarm intelligence for self-organized clustering, Learning mixtures of separated nonspherical Gaussians, Statistical methods for tissue array images -- algorithmic scoring and co-training, A comparison principle for functions of a uniformly random subspace, Extensions of Lipschitz maps into Banach spaces, Randomized interpolative decomposition of separated representations, Deep network based on stacked orthogonal convex incremental ELM autoencoders, Fuzzy \(c\)-means and cluster ensemble with random projection for big data clustering, R3P-Loc: a compact multi-label predictor using ridge regression and random projection for protein subcellular localization, Finite metric spaces needing high dimension for Lipschitz embeddings in Banach spaces, Sketching information divergences, Isometric embedding in \(\ell_ p\)-spaces, Volume distortion for subsets of Euclidean spaces, Lower bounds for local versions of dimension reductions, A simple proof of the restricted isometry property for random matrices, Rigorous restricted isometry property of low-dimensional subspaces, Fast dimension reduction using Rademacher series on dual BCH codes, High-dimensional model recovery from random sketched data by exploring intrinsic sparsity, Latent semantic indexing: A probabilistic analysis, On computing the diameter of a point set in high dimensional Euclidean space., Some geometric applications of the beta distribution, Proportional concentration phenomena on the sphere, On approximating planar metrics by tree metrics., On approximate 
nearest neighbors under \(l_\infty\) norm, Population recovery and partial identification, Similarity, kernels, and the fundamental constraints on cognition, Sparse control of alignment models in high dimension, Two observations regarding embedding subsets of Euclidean spaces in normed spaces, On the smallest possible dimension and the largest possible margin of linear arrangements representing given concept classes