Statistical-computational trade-offs in tensor PCA and related problems via communication complexity
Abstract: Tensor PCA is a stylized statistical inference problem introduced by Montanari and Richard to study the computational difficulty of estimating an unknown parameter from higher-order moment tensors. Unlike its matrix counterpart, Tensor PCA exhibits a statistical-computational gap, i.e., a sample-size regime where the problem is information-theoretically solvable but conjectured to be computationally hard. This paper uses communication complexity to derive computational lower bounds on the runtime of memory-bounded algorithms for Tensor PCA. These lower bounds specify a trade-off among the number of passes through the data sample, the sample size, and the memory required by any algorithm that successfully solves Tensor PCA. While the lower bounds do not rule out polynomial-time algorithms, they do imply that many commonly used algorithms, such as gradient descent and the power method, must have a higher iteration count when the sample size is not large enough. Similar lower bounds are obtained for Non-Gaussian Component Analysis, a family of statistical estimation problems in which low-order moment tensors carry no information about the unknown parameter. Finally, stronger lower bounds are obtained for an asymmetric variant of Tensor PCA and for related statistical estimation problems. These results explain why many estimators for these problems use a memory state that is significantly larger than the effective dimensionality of the parameter of interest.
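To make the setup concrete, the following is a minimal sketch of the standard order-3 spiked tensor model of Montanari and Richard, Y = λ·v⊗v⊗v + Z, together with the tensor power iteration mentioned in the abstract. The dimension n, signal strength lam, and iteration count below are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

# Minimal sketch of the order-3 spiked tensor model
#     Y = lam * v (x) v (x) v + Z,   Z with i.i.d. N(0, 1) entries,
# and tensor power iteration. All parameter choices are illustrative
# assumptions, not values from the paper.
rng = np.random.default_rng(0)
n, iters = 50, 50
lam = 3 * n  # randomly initialized power iteration needs a strong signal

# Planted unit vector and noisy observation tensor
# (the noise tensor is left unsymmetrized for brevity).
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
Y = lam * np.einsum("i,j,k->ijk", v, v, v) + rng.standard_normal((n, n, n))

# Tensor power iteration: u <- Y(., u, u) / ||Y(., u, u)||.
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
for _ in range(iters):
    u = np.einsum("ijk,j,k->i", Y, u, u)
    u /= np.linalg.norm(u)

print("correlation |<u, v>| =", abs(u @ v))  # close to 1 on success
```

Note that this sketch keeps the full n×n×n tensor in memory while updating only an n-dimensional iterate; the paper's lower bounds concern what happens to such procedures when memory is restricted and the data must be accessed in a limited number of passes.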
Cites work
- Scientific article (zbMATH DE number 7306916; no title available)
- Algorithmic thresholds for tensor PCA
- Communication lower bounds for statistical estimation problems via a distributed data processing inequality
- Computational barriers to estimation from low-degree polynomials
- Concentration inequalities. A nonasymptotic theory of independence
- Continuous LWE
- Efficient algorithms and lower bounds for robust linear regression
- Efficient noise-tolerant learning from statistical queries
- Entropy samplers and strong generic lower bounds for space bounded learning
- Extractor-based time-space lower bounds for learning
- Fast learning requires good memory: a time-space lower bound for parity learning
- Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
- Geometric lower bounds for distributed parameter estimation under communication constraints
- High dimensional estimation via sum-of-squares proofs
- In search of non-Gaussian components of a high-dimensional distribution
- Lattice-based cryptography
- Memory-sample tradeoffs for linear regression with small error
- Non-Gaussian component analysis using entropy methods
- Notes on computational hardness of hypothesis testing: predictions using the low-degree likelihood ratio
- Notes on computational-to-statistical gaps: predictions using statistical physics
- On Bayes risk lower bounds
- On the complexity of random satisfiability problems with planted solutions
- Statistical algorithms and a lower bound for detecting planted cliques
- Statistical query lower bounds for tensor PCA
- Sum-of-squares certificates for maxima of random tensors on the sphere
- Tensor SVD: statistical and computational limits
- The landscape of the spiked tensor model
- The space complexity of approximating the frequency moments
- Time-space hardness of learning sparse parities