Boolean autoencoders and hypercube clustering complexity
Mathematics Subject Classification:
- 68T05 Learning and adaptive systems in artificial intelligence
- 68Q17 Computational difficulty of problems (lower bounds, completeness, difficulty of approximation, etc.)
- 68P30 Coding and information theory (compaction, compression, models of communication, encoding schemes, etc.) (aspects in computer science)
- 94C05 Analytic circuit theory
Cites work
- Scientific article, zbMATH DE number 3567782 (title unavailable)
- Scientific article, zbMATH DE number 3586931 (title unavailable)
- Scientific article, zbMATH DE number 3639144 (title unavailable)
- $B$-valuations of graphs
- A Clustering and Data-Reorganizing Algorithm
- A Fast Learning Algorithm for Deep Belief Nets
- Clustering by passing messages between data points
- Complex-valued autoencoders
- Cubical graphs and cubical dimensions
- Deep, Narrow Sigmoid Belief Networks Are Universal Approximators
- Learning representations by back-propagating errors
- On the Complexity of Some Common Geometric Location Problems
- On the complexity of some coding problems (Corresp.)
- On the inherent intractability of certain coding problems (Corresp.)
- Proof of the squashed cube conjecture
- Reducing the Dimensionality of Data with Neural Networks
- The Planar k-Means Problem is NP-Hard
- The complexity of cubical graphs
- The homeomorphic embedding of \(K_n\) in the \(m\)-cube
- The intractability of computing the minimum distance of a code
- Why does unsupervised pre-training help deep learning?
MaRDI item: Q690669