Dimensionality-Dependent Generalization Bounds for k-Dimensional Coding Schemes
From MaRDI portal
Publication:5380581
DOI: 10.1162/NECO_a_00872 · zbMath: 1474.68315 · arXiv: 1601.00238 · OpenAlex: W2223051473 · Wikidata: Q39621250 · Scholia: Q39621250 · MaRDI QID: Q5380581
Dacheng Tao, Dong Xu, Tongliang Liu
Publication date: 5 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1601.00238
Cites Work
- Fast rates for empirical vector quantization
- Probability inequalities for empirical processes and a law of the iterated logarithm
- A central limit theorem for k-means clustering
- Sharper bounds for Gaussian and empirical processes
- Nonasymptotic bounds for vector quantization in Hilbert spaces
- On the mathematical foundations of learning
- Sample Complexity of Dictionary Learning and Other Matrix Factorizations
- Distance Learning in Discriminative Vector Quantization
- Improved Minimax Bounds on the Test and Training Distortion of Empirically Designed Vector Quantizers
- Individual Convergence Rates in Empirical Vector Quantizer Design
- On the Performance of Clustering in Hilbert Spaces
- Nonnegative Matrix Factorization with the Itakura-Saito Divergence: With Application to Music Analysis
- Adaptive Relevance Matrices in Learning Vector Quantization
- Atomic Decomposition by Basis Pursuit
- On the training distortion of vector quantizers
- A local search approximation algorithm for k-means clustering
- NeNMF: An Optimal Gradient Method for Nonnegative Matrix Factorization
- The minimax distortion redundancy in empirical quantizer design
- 10.1162/153244302760200713
- 10.1162/153244303321897690
- Unsupervised Spike Detection and Sorting with Wavelets and Superparamagnetic Clustering
- Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding
- Semidefinite Programming Based Preconditioning for More Robust Near-Separable Nonnegative Matrix Factorization
- $K$-Dimensional Coding Schemes in Hilbert Spaces
- Sparse Coding on the Spot: Spontaneous Retinal Waves Suffice for Orientation Selectivity
- Probability Inequalities for Sums of Bounded Random Variables
- Learning the parts of objects by non-negative matrix factorization
- Manifold Regularized Discriminative Nonnegative Matrix Factorization With Fast Gradient Descent
- Improved Sparse Coding Under the Influence of Perceptual Attention
- A Hebbian/Anti-Hebbian Neural Network for Linear Subspace Learning: A Derivation from Multidimensional Scaling of Streaming Data