Learning to imitate stochastic time series in a compositional way by chaos
DOI: 10.1016/j.neunet.2009.12.006 · zbMATH Open: 1396.68099 · arXiv: 0805.1795 · OpenAlex: W2024967153 · Wikidata: Q45774303 · Scholia: Q45774303 · MaRDI QID: Q1784572 · FDO: Q1784572
Authors: Jun Namikawa, Jun Tani
Publication date: 27 September 2018
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/0805.1795
Recommendations
- Evaluating generation of chaotic time series by convolutional generative adversarial networks
- Learning chaotic dynamics by neural networks
- Generating replications of chaotic time series
- Noisy time series generation by feed-forward networks
- Learning latent dynamics for partially observed chaotic systems
- A dilated convolution network-based LSTM model for multi-step prediction of chaotic time series
Classification (MSC)
- 62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
- 68T05 Learning and adaptive systems in artificial intelligence
- 37D45 Strange attractors, chaotic dynamics of systems with hyperbolic behavior
- 37M10 Time series analysis of dynamical systems
Cites Work
- Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM
- An Introduction to Symbolic Dynamics and Coding
- Chaotic itinerancy
- Learning to generate combinatorial action sequences utilizing the initial sensitivity of deterministic dynamical systems
- A model for learning to segment temporal sequences, utilizing a mixture of RNN experts together with adaptive variance
Cited in 3 documents