Fast Estimation of Approximate Matrix Ranks Using Spectral Densities
From MaRDI portal
Publication:5380703
Abstract: In many machine learning and data-related applications, it is necessary to know the approximate rank of a large data matrix at hand. In this paper, we present two computationally inexpensive techniques for estimating the approximate ranks of such large matrices. These techniques exploit approximate spectral densities, popular in physics, which are probability density distributions that measure the likelihood of finding an eigenvalue of the matrix at a given point on the real line. Integrating the spectral density over an interval gives the eigenvalue count of the matrix in that interval, so the rank can be approximated by integrating the spectral density over a carefully selected interval. Two approaches are discussed for estimating the approximate rank, one based on Chebyshev polynomials and the other on the Lanczos algorithm. To obtain an appropriate interval, it is necessary to locate a gap between the eigenvalues that correspond to noise and the relevant eigenvalues that contribute to the matrix rank. A method is proposed for locating this gap and selecting the interval of integration from a plot of the spectral density. Numerical experiments illustrate the performance of these techniques on matrices from typical applications.
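The Chebyshev-based approach described in the abstract can be illustrated with a short stochastic eigenvalue-count estimator. The following is a minimal sketch, not the authors' implementation: it assumes the symmetric input matrix has already been rescaled so that its spectrum lies in [-1, 1], and it combines a Jackson-damped Chebyshev expansion of the interval's indicator function with a Hutchinson-type trace estimator. The function name and default parameters are illustrative.

```python
import numpy as np

def eig_count_chebyshev(A, a, b, deg=80, nv=30, seed=None):
    """Estimate the number of eigenvalues of the symmetric matrix A in [a, b].

    Assumes the spectrum of A is contained in [-1, 1]; rescale beforehand
    if necessary.  deg is the Chebyshev polynomial degree, nv the number of
    random probe vectors for the Hutchinson trace estimator.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    j = np.arange(deg + 1)

    # Chebyshev expansion coefficients of the indicator function of [a, b]:
    # c_0 = (arccos(a) - arccos(b)) / pi,
    # c_k = 2 (sin(k arccos(a)) - sin(k arccos(b))) / (k pi)  for k >= 1.
    theta_a, theta_b = np.arccos(a), np.arccos(b)
    c = np.empty(deg + 1)
    c[0] = (theta_a - theta_b) / np.pi
    c[1:] = 2.0 * (np.sin(j[1:] * theta_a) - np.sin(j[1:] * theta_b)) / (j[1:] * np.pi)

    # Jackson damping reduces the Gibbs oscillations of the truncated expansion.
    N = deg + 1
    g = ((N - j + 1) * np.cos(np.pi * j / (N + 1))
         + np.sin(np.pi * j / (N + 1)) / np.tan(np.pi / (N + 1))) / (N + 1)
    c *= g

    # Hutchinson estimator: average v^T p(A) v over random sign vectors v,
    # with p(A) v built from the three-term Chebyshev recurrence
    # T_{k+1}(A) v = 2 A T_k(A) v - T_{k-1}(A) v.
    total = 0.0
    for _ in range(nv):
        v = rng.choice((-1.0, 1.0), size=n)
        t_prev, t_cur = v, A @ v
        s = c[0] * (v @ t_prev) + c[1] * (v @ t_cur)
        for ck in c[2:]:
            t_prev, t_cur = t_cur, 2.0 * (A @ t_cur) - t_prev
            s += ck * (v @ t_cur)
        total += s
    return total / nv
```

To estimate an approximate rank as the paper suggests, one would first locate a gap in a plot of the spectral density and then call this routine with `[a, b]` covering the eigenvalues above the gap.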
Recommendations
- Randomized estimation of spectral densities of large matrices made accurate
- Approximating spectral densities of large matrices
- Estimating the Rank of the Spectral Density Matrix
- Fast gradient method for low-rank matrix estimation
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Rank Detection Methods for Sparse Matrices
- Fast computation of low rank matrix approximations
- A fast and efficient algorithm for low-rank approximation of a matrix
Cites work
- scientific article; zbMATH DE number 5957338
- scientific article; zbMATH DE number 3179793
- scientific article; zbMATH DE number 3671573
- scientific article; zbMATH DE number 6159604
- scientific article; zbMATH DE number 961607
- scientific article; zbMATH DE number 3195586
- A novel M-estimator for robust PCA
- A stochastic estimator of the trace of the influence matrix for Laplacian smoothing splines
- Active subspace: toward scalable low-rank learning
- Analysis of Some Krylov Subspace Approximations to the Matrix Exponential Operator
- Approximating spectral densities of large matrices
- Approximation of step functions in problems of mathematical modeling
- Approximation theory and approximation practice
- Bi-cross-validation of the SVD and the nonnegative matrix factorization
- Dimension Selection for Feature Selection and Dimension Reduction with Principal and Independent Component Analysis
- Efficient estimation of eigenvalue counts in an interval
- Electronic structure calculations for plane-wave codes without diagonalization
- Fast Monte Carlo Algorithms for Matrices II: Computing a Low-Rank Approximation to a Matrix
- Fast and Stable Subspace Tracking
- Filtered Conjugate Residual‐type Algorithms with Applications
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Gaussian processes for machine learning
- Generalized low rank approximations of matrices
- Improved bounds on sample size for implicit matrix trace estimators
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Least squares data fitting with applications
- Low-rank approximation. Algorithms, implementation, applications
- Matrices, moments and quadrature with applications
- Multivariate reduced-rank regression
- Non-Parametric Detection of the Number of Signals: Hypothesis Testing and Random Matrix Theory
- Numerical methods for large eigenvalue problems
- On the Asymptotic Properties of LDU-Based Tests of the Rank of a Matrix
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Principal component analysis
- Randomized algorithms for estimating the trace of an implicit symmetric positive semi-definite matrix
- Rank revealing QR factorizations
- Rank estimation in missing data matrix problems
- Rank estimation in reduced-rank regression
- Relations among some low-rank subspace recovery models
- Robust principal component analysis?
- Statistical Tests and Estimators of the Rank of a Matrix and Their Applications in Econometric Modelling
- Tests of rank
- The University of Florida sparse matrix collection
- The kernel polynomial method
Cited in (4)