Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes
From MaRDI portal
Publication:889265
Keywords: stochastic processes; approximation; neural networks; relative entropy; echo state networks; linear inverse modeling; El Niño phenomenon
MSC classifications:
- Probabilistic models, generic numerical methods in probability and statistics (65C20)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Stochastic processes (60G99)
- Meteorology and atmospheric physics (86A10)
- Other physical applications of random processes (60K40)
Abstract: A method is provided for designing and training noise-driven recurrent neural networks as models of stochastic processes. The method unifies and generalizes two previously separate modeling approaches, Echo State Networks (ESN) and Linear Inverse Modeling (LIM), under the common principle of relative entropy minimization. The power of the new method is demonstrated on a stochastic approximation of the El Niño phenomenon studied in climate research.
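The abstract's unifying principle, fitting a noisy model to a stochastic process by minimizing relative entropy (Kullback–Leibler divergence), can be illustrated on a toy problem. The sketch below is not the paper's ESN/LIM algorithm: it fits only the drift parameter of a scalar AR(1) surrogate process by minimizing the closed-form KL divergence between Gaussians over a parameter grid. The names `gauss_kl` and `model_gauss` are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a scalar AR(1) process x_{t+1} = a*x_t + sigma*eps_t,
# standing in for a stochastic process such as an El Niño index.
a_true, sigma, T = 0.8, 0.5, 20000
x = np.zeros(T)
for t in range(T - 1):
    x[t + 1] = a_true * x[t] + sigma * rng.standard_normal()

def gauss_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) )."""
    k = len(mu0)
    inv1 = np.linalg.inv(cov1)
    d = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + d @ inv1 @ d - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

# Empirical Gaussian of consecutive pairs (x_t, x_{t+1}).
pairs = np.stack([x[:-1], x[1:]], axis=1)
mu_emp = pairs.mean(axis=0)
cov_emp = np.cov(pairs.T)

def model_gauss(a, s):
    """Stationary Gaussian of (x, x') under the model x' = a*x + s*eps."""
    v = s**2 / (1 - a**2)  # stationary variance of the AR(1) model
    cov = np.array([[v, a * v],
                    [a * v, a**2 * v + s**2]])
    return np.zeros(2), cov

# Relative entropy minimization over the drift parameter a
# (noise level s is held fixed at its true value for simplicity).
a_grid = np.linspace(-0.95, 0.95, 381)
kls = [gauss_kl(mu_emp, cov_emp, *model_gauss(a, sigma)) for a in a_grid]
a_hat = a_grid[np.argmin(kls)]
```

With enough data, the KL-minimizing drift `a_hat` recovers the true value `a_true`; the paper's method plays the analogous game in the richer family of noise-driven recurrent networks, where LIM corresponds to the linear special case.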
Recommendations
- Stochastic Neural Networks With Applications to Nonlinear Time Series
- Metric entropy limits on recurrent neural network learning of linear dynamical systems
- Maximum-entropy approximations of stochastic nonlinear transductions: An extension of the Wiener theory
- scientific article; zbMATH DE number 2221002
- Modelling nonstationary dynamics
Cites work
- scientific article; zbMATH DE number 51724
- scientific article; zbMATH DE number 3567782
- A Fast Learning Algorithm for Deep Belief Nets
- A biological gradient descent for prediction through a combination of STDP and homeostatic plasticity
- A stochastic model of Indo-Pacific sea surface temperature anomalies
- Calibrating volatility surfaces via relative-entropy minimization
- Error analysis of Jacobi derivative estimators for noisy signals
- Gaussian processes for machine learning.
- On Information and Sufficiency
- On Transforming a Certain Class of Stochastic Processes by Absolutely Continuous Substitution of Measures
- Optimization and applications of echo state networks with leaky-integrator neurons
- Pattern recognition and machine learning.
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Reservoir computing approaches to recurrent neural network training
- Some Numerical Methods for Rare Events Simulation and Analysis
- Statistical Inference for Probabilistic Functions of Finite State Markov Chains
- The Fokker-Planck equation. Methods of solution and applications.
Cited in (3)
This page was built for publication: Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes