Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes

From MaRDI portal
Publication:889265

DOI: 10.1016/J.NEUNET.2014.04.002
zbMATH Open: 1347.60063
arXiv: 1402.1613
OpenAlex: W1978148674
Wikidata: Q51088425
Scholia: Q51088425
MaRDI QID: Q889265
FDO: Q889265


Authors: Mathieu N. Galtier, Camille Marini, Gilles Wainrib, Herbert Jaeger


Publication date: 6 November 2015

Published in: Neural Networks

Abstract: A method is provided for designing and training noise-driven recurrent neural networks as models of stochastic processes. The method unifies and generalizes two previously separate modeling approaches, Echo State Networks (ESN) and Linear Inverse Modeling (LIM), under the common principle of relative entropy minimization. The power of the new method is demonstrated on a stochastic approximation of the El Niño phenomenon studied in climate research.


Full work available at URL: https://arxiv.org/abs/1402.1613
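For orientation, the following is a minimal illustrative sketch, not the paper's algorithm: an echo-state-style noisy recurrent network whose linear readout is fit by ridge regression, which under a Gaussian noise assumption amounts to the kind of likelihood / relative-entropy objective the abstract alludes to. All dimensions, parameter values, and the toy driving data are assumptions made only for this example.

import numpy as np

# Illustrative sketch (assumptions only; not the method of Galtier et al.):
# a noisy nonlinear reservoir driven by an observed sequence, with a linear
# readout fit in closed form by ridge regression (Gaussian maximum likelihood).

rng = np.random.default_rng(0)

n_res, n_obs, T = 200, 3, 1000          # reservoir size, observable dim, length (assumed)
leak, noise_std, ridge = 0.3, 0.01, 1e-4  # leak rate, state noise, regularization (assumed)

# Sparse random recurrent weights, rescaled to a target spectral radius (standard ESN practice).
W = rng.normal(size=(n_res, n_res)) * (rng.random((n_res, n_res)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(scale=0.5, size=(n_res, n_obs))

def run_reservoir(y_seq):
    # Drive the noisy nonlinear reservoir with an observed sequence y_seq of shape (steps, n_obs).
    x = np.zeros(n_res)
    states = np.empty((len(y_seq), n_res))
    for t, y in enumerate(y_seq):
        pre = W @ x + W_in @ y + noise_std * rng.normal(size=n_res)
        x = (1 - leak) * x + leak * np.tanh(pre)
        states[t] = x
    return states

# Toy "observed" stochastic process standing in for, e.g., climate indices (assumed data).
y = np.cumsum(rng.normal(scale=0.1, size=(T, n_obs)), axis=0)

X = run_reservoir(y[:-1])   # reservoir states used to predict the next observation
Y = y[1:]

# Ridge-regression readout: closed-form fit, equivalent to Gaussian maximum likelihood.
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

pred = X @ W_out
print("one-step RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))

The ridge term here plays the role of a prior on the readout; the paper's contribution, by contrast, is to derive the training objective for the full noisy recurrent model from relative entropy minimization, recovering ESN and LIM as special cases.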





Cited In (3)





