Fundamental aspects of noise in analog-hardware neural networks

From MaRDI portal
Publication: 4973007

DOI: 10.1063/1.5120824
zbMATH Open: 1425.92014
arXiv: 1907.09002
OpenAlex: W3098895744
Wikidata: Q91058100
Scholia: Q91058100
MaRDI QID: Q4973007
FDO: Q4973007

Author name not available

Publication date: 29 November 2019

Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science

Abstract: We study the fundamental aspects of noise propagation in recurrent as well as deep, multi-layer networks. The main focus of our study is neural networks in analogue hardware, yet the methodology provides insight for networks in general. The system under study consists of noisy linear nodes, and we investigate the signal-to-noise ratio at the network's outputs, which sets the upper limit on such a system's computing accuracy. We consider additive and multiplicative noise, which can be purely local or correlated across populations of neurons. This covers the chief internal perturbations of hardware networks; the noise amplitudes were obtained from a physically implemented recurrent neural network and therefore correspond to a real-world system. Analytic solutions agree exceptionally well with numerical data, enabling clear identification of the most critical components and aspects for noise management. Focusing on linear nodes isolates the impact of network connections and allows us to derive strategies for mitigating noise. Our work is a starting point for addressing this aspect of analogue neural networks, and our results identify notoriously sensitive points while simultaneously highlighting the robustness of such computational systems.
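The sketch below is a minimal numerical illustration of the setting described in the abstract, not the authors' actual model: a layered network of linear nodes whose outputs are perturbed by uncorrelated (purely local) additive and multiplicative Gaussian noise, with the output signal-to-noise ratio estimated over repeated noisy passes. All parameter values (layer count, node count, noise amplitudes) are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not taken from the paper)
n_nodes = 50        # nodes per layer
n_layers = 5        # depth of the deep (feed-forward) network
sigma_add = 1e-3    # additive noise amplitude (assumed)
sigma_mul = 1e-3    # multiplicative noise amplitude (assumed)
n_trials = 2000     # repetitions used to estimate the output SNR

# Fixed random connection weights, scaled so the signal level stays O(1)
weights = [rng.normal(scale=1.0 / np.sqrt(n_nodes), size=(n_nodes, n_nodes))
           for _ in range(n_layers)]
x0 = rng.normal(size=n_nodes)  # the same input signal for every trial

def forward(x, noisy=True):
    """Propagate x through the layered linear network, optionally adding node noise."""
    for W in weights:
        x = W @ x
        if noisy:
            x = x * (1.0 + sigma_mul * rng.normal(size=n_nodes))  # multiplicative noise
            x = x + sigma_add * rng.normal(size=n_nodes)          # additive noise
    return x

clean = forward(x0, noisy=False)                      # noise-free reference output
noisy_outputs = np.stack([forward(x0) for _ in range(n_trials)])
noise_power = np.mean((noisy_outputs - clean) ** 2)   # variance around the reference
signal_power = np.mean(clean ** 2)
print("estimated output SNR:", signal_power / noise_power)
```

Noise correlated across a population, as also considered in the paper, could be mimicked in this sketch by drawing a single scalar noise term per layer and applying it to all nodes instead of drawing an independent term per node.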


Full work available at URL: https://arxiv.org/abs/1907.09002





Cites Work


Cited In (5)






This page was built for publication: Fundamental aspects of noise in analog-hardware neural networks
