Central limit theorems for classical multidimensional scaling
From MaRDI portal
Publication:2192306
Abstract: Classical multidimensional scaling is a widely used method in dimensionality reduction and manifold learning. The method takes in a dissimilarity matrix and outputs a low-dimensional configuration matrix based on a spectral decomposition. In this paper, we present three noise models and analyze the resulting configuration matrices, or embeddings. In particular, we show that under each of the three noise models the resulting embedding gives rise to a central limit theorem. We also provide compelling simulations and real-data illustrations of these central limit theorems. This perturbation analysis represents a significant advancement over previous results on the behavior of classical multidimensional scaling under randomness.
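For reference, the procedure the abstract describes — double-centering the squared dissimilarities and taking a truncated eigendecomposition to obtain the configuration matrix — can be sketched in NumPy. This is a generic illustration of classical MDS, not code from the paper; the function name and interface are illustrative.

```python
import numpy as np

def classical_mds(D, d=2):
    """Embed an n x n dissimilarity matrix D into d dimensions via classical MDS."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)     # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:d]      # keep the top-d eigenpairs
    scale = np.sqrt(np.maximum(eigvals[idx], 0))  # clip small negative eigenvalues
    return eigvecs[:, idx] * scale           # n x d configuration matrix

# Points on a line: the embedding recovers the geometry up to sign/translation.
X = np.array([[0.0], [1.0], [3.0]])
D = np.abs(X - X.T)                          # exact pairwise distances
Y = classical_mds(D, d=1)
```

When the input is an exact Euclidean distance matrix, the recovered configuration reproduces the original pairwise distances; under the paper's noise models the input is a perturbed dissimilarity matrix, and the resulting embedding is what the central limit theorems describe.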
Recommendations
- An analysis of classical multidimensional scaling with applications to clustering
- Taking all positive eigenvectors is suboptimal in classical multidimensional scaling
- Multidimensional scaling on metric measure spaces
- Multidimensional scaling of noisy high dimensional data
- Perturbation bounds for procrustes, classical scaling, and trilateration, with applications to manifold learning
Cites work
- scientific article; zbMATH DE number 3642533
- scientific article; zbMATH DE number 5286897
- scientific article; zbMATH DE number 49702
- scientific article; zbMATH DE number 3806761
- A class of invariant consistent tests for multivariate normality
- A limit theorem for scaled eigenvectors of random dot product graphs
- A remark on global positioning from local distances
- A useful variant of the Davis-Kahan theorem for statisticians
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of ``Eckart-Young'' decomposition
- Automatic dimensionality selection from the scree plot via the use of profile likelihood
- Diffusion maps
- Distance shrinkage and Euclidean embedding via regularized kernel estimation
- Energy statistics: a class of statistics based on distances
- Exact Reconstruction of Euclidean Distance Geometry Problem Using Low-Rank Matrix Completion
- High-dimensional probability. An introduction with applications in data science
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Large deformation diffeomorphic metric curve mapping
- Local multidimensional scaling for nonlinear dimension reduction, graph drawing, and proximity analysis
- Localization from incomplete noisy distance measurements
- Matrix estimation by universal singular value thresholding
- Measures of multivariate skewness and kurtosis with applications
- Modern multidimensional scaling. Theory and applications.
- Multidimensional scaling.
- Multidimensional scaling. I: Theory and method
- Single subject incomplete designs for nonmetric multidimensional scaling
- Solving Euclidean distance matrix completion problems via semidefinite programming
- The Dissimilarity Representation for Pattern Recognition
- The Euclidean Distance Matrix Completion Problem
- The Rotation of Eigenvectors by a Perturbation. III
- User-friendly tail bounds for sums of random matrices
Cited in (6)
- A dual basis approach to multidimensional scaling
- An analysis of classical multidimensional scaling with applications to clustering
- Perturbation bounds for procrustes, classical scaling, and trilateration, with applications to manifold learning
- Multi-dimensional scaling from \(K\)-nearest neighbourhood distances
- The out-of-sample problem for classical multidimensional scaling
- Singular vector distribution of sample covariance matrices