K-Deep Simplex: Deep Manifold Learning via Local Dictionaries

From MaRDI portal
Publication:6355255

Abstract: We propose K-Deep Simplex (KDS), which, given a set of data points, learns a dictionary of synthetic landmarks together with representation coefficients supported on a simplex. KDS integrates manifold learning with sparse coding/dictionary learning: its objective combines a reconstruction term, as in classical dictionary learning, with a novel locally weighted ℓ1 penalty that encourages each data point to represent itself as a convex combination of nearby landmarks. We solve the resulting optimization program by alternating minimization and design an efficient, interpretable autoencoder via algorithm unrolling. We analyze the program theoretically by relating the weighted ℓ1 penalty in KDS to a weighted ℓ0 program. Assuming the data are generated from a Delaunay triangulation, we prove that the weighted ℓ1 and weighted ℓ0 programs are equivalent. Given the representation coefficients, we prove that the resulting dictionary is unique. Further, we show that low-dimensional representations can be obtained efficiently from the covariance of the coefficient matrix. We apply KDS to unsupervised clustering and prove theoretical performance guarantees. Experiments show that the algorithm is highly efficient and performs competitively on synthetic and real data sets.
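The objective sketched in the abstract can be illustrated as follows. This is a hypothetical sketch, not the authors' implementation: the function names, step sizes, and the use of projected-gradient updates within alternating minimization are assumptions. It minimizes a reconstruction term plus a locality-weighted ℓ1 penalty over simplex-constrained coefficients; because each coefficient column lies on the simplex (nonnegative, summing to one), the weighted ℓ1 penalty reduces to a linear term.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of each column of v onto the probability simplex."""
    n, m = v.shape
    u = np.sort(v, axis=0)[::-1]                 # sort each column descending
    css = np.cumsum(u, axis=0) - 1.0
    ind = np.arange(1, n + 1)[:, None]
    cond = u - css / ind > 0
    rho = n - 1 - np.argmax(cond[::-1], axis=0)  # last index where cond holds
    theta = css[rho, np.arange(m)] / (rho + 1)
    return np.maximum(v - theta, 0.0)

def kds_sketch(Y, k, lam=0.1, n_iter=150, lr=0.02, seed=0):
    """Hypothetical alternating-minimization sketch of the KDS objective
    (assumed, not the authors' code).  Y: (d, n) data; k: number of landmarks.
    Minimizes 0.5 * ||Y - D X||_F^2 + lam * <W, X>, where W_ij = ||d_i - y_j||^2
    and each column of X is constrained to the probability simplex."""
    rng = np.random.default_rng(seed)
    d, n = Y.shape
    D = Y[:, rng.choice(n, k, replace=False)].copy()  # init landmarks from data
    X = project_simplex(rng.random((k, n)))           # coefficients on simplex
    for _ in range(n_iter):
        # Locality weights: squared distance from each landmark to each point.
        W = ((D[:, :, None] - Y[:, None, :]) ** 2).sum(axis=0)      # (k, n)
        # Projected gradient step on X (the weighted l1 term is linear on the simplex).
        grad_X = D.T @ (D @ X - Y) + lam * W
        X = project_simplex(X - lr * grad_X)
        # Gradient step on D: reconstruction term plus locality term,
        # whose gradient w.r.t. landmark d_i is 2 * sum_j X_ij (d_i - y_j).
        grad_D = (D @ X - Y) @ X.T + 2 * lam * (D * X.sum(axis=1) - Y @ X.T)
        D = D - lr * grad_D
    return D, X
```

Unrolling the X-update above for a fixed number of iterations, with learnable step sizes, is one way an interpretable autoencoder of the kind the abstract mentions could be structured.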