A topological insight into restricted Boltzmann machines
From MaRDI portal
Publication:331673
Abstract: Restricted Boltzmann machines (RBMs) and models derived from them have been used successfully as basic building blocks in deep artificial neural networks for automatic feature extraction and unsupervised weight initialization, but also as density estimators. Their generative and discriminative capabilities, as well as their computational cost, are therefore instrumental to a wide range of applications. Our main contribution is to look at RBMs from a topological perspective, bringing insights from network science. First, we show that RBMs and Gaussian RBMs (GRBMs) are bipartite graphs that naturally have a small-world topology. Second, we demonstrate on both synthetic and real-world datasets that constraining RBMs and GRBMs to a scale-free topology (while still considering local neighborhoods and the data distribution) reduces the number of weights that need to be computed by a few orders of magnitude, at virtually no loss in generative performance. Third, we show that, for a fixed number of weights, our proposed sparse models (which by design have more hidden neurons) achieve better generative capabilities than standard fully connected RBMs and GRBMs (which by design have fewer hidden neurons), at no additional computational cost.
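The core idea of the abstract — replacing a dense RBM weight matrix with a sparse, scale-free connectivity pattern — can be sketched as follows. This is a minimal illustration, not the paper's actual construction: the function name `scale_free_mask`, the truncated power-law sampling, and the uniform choice of visible partners are all assumptions here, whereas the paper additionally takes local neighborhoods and the data distribution into account.

```python
import numpy as np

def scale_free_mask(n_visible, n_hidden, gamma=2.0, seed=0):
    """Build a sparse binary connectivity mask for an RBM whose
    hidden-unit degrees approximately follow a power law P(d) ~ d^-gamma.

    Illustrative sketch only: the paper's construction also respects
    local neighborhoods and the data distribution when picking edges.
    """
    rng = np.random.default_rng(seed)
    # Truncated power-law distribution over possible degrees 1..n_visible.
    degrees = np.arange(1, n_visible + 1)
    probs = degrees.astype(float) ** -gamma
    probs /= probs.sum()
    # Sample one degree per hidden unit.
    sampled = rng.choice(degrees, size=n_hidden, p=probs)
    mask = np.zeros((n_visible, n_hidden), dtype=bool)
    for j, d in enumerate(sampled):
        # Connect hidden unit j to d distinct visible units (uniformly here).
        vis = rng.choice(n_visible, size=d, replace=False)
        mask[vis, j] = True
    return mask

# Example: an MNIST-sized layer. The dense model has 784 * 1000 weights;
# the scale-free mask keeps only a small fraction of them active.
mask = scale_free_mask(n_visible=784, n_hidden=1000)
print(f"dense weights: {784 * 1000}, sparse weights: {int(mask.sum())}")
```

During training, such a mask would simply be multiplied element-wise into the weight matrix (and its gradient), so only the retained connections are ever updated — which is where the reduction in computed weights claimed in the abstract comes from.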
Recommendations
- Restricted Boltzmann machines: a review
- Restricted Boltzmann machines
- Topological approaches to deep learning
- Restricted Boltzmann machines: introduction and review
- Representational Power of Restricted Boltzmann Machines and Deep Belief Networks
- Restricted Boltzmann machines as models of interacting variables
Cites work
- Collective dynamics of `small-world' networks
- Emergence of Scaling in Random Networks
- Exploring complex networks
- Learning deep architectures for AI
- On Realizability of a Set of Integers as Degrees of the Vertices of a Linear Graph. I
- Power-law distributions in empirical data
- Random graphs and complex networks. Volume 1
- Reducing the Dimensionality of Data with Neural Networks
- The flip-the-state transition operator for restricted Boltzmann machines
- Training Products of Experts by Minimizing Contrastive Divergence
Cited in (7)
- Quick and robust feature selection: the strength of energy-efficient sparse training for autoencoders
- scientific article; zbMATH DE number 7626756
- A factor graph model for unsupervised feature selection
- The emergence of a concept in shallow neural networks
- A brain-inspired algorithm for training highly sparse neural networks
- Topologically ordered feature extraction based on sparse group restricted Boltzmann machines
- scientific article; zbMATH DE number 6542830