Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data

From MaRDI portal

Publication:4563888

DOI: 10.1063/1.5010300
zbMath: 1390.37138
arXiv: 1710.07313
OpenAlex: W2765128778
Wikidata: Q47172677
Scholia: Q47172677
MaRDI QID: Q4563888

Michelle Girvan, Brian R. Hunt, Zhixin Lu, Edward Ott, Jaideep Pathak

Publication date: 4 June 2018

Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science

Full work available at URL: https://arxiv.org/abs/1710.07313




Related Items (65)

Simple estimation method for the second-largest Lyapunov exponent of chaotic differential equations
Robustness of LSTM neural networks for multi-step forecasting of chaotic time series
Machine-learning construction of a model for a macroscopic fluid variable using the delay-coordinate of a scalar observable
Synchronization of reservoir computers with applications to communications
Unnamed Item
Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks
Modeling chaotic systems: dynamical equations vs machine learning approach
Dissecting cell fate dynamics in pediatric glioblastoma through the lens of complex systems and cellular cybernetics
Detection of generalized synchronization using echo state networks
Time series analysis and prediction of nonlinear systems with ensemble learning framework applied to deep learning neural networks
A framework for machine learning of model error in dynamical systems
Fading memory echo state networks are universal
Robust optimization and validation of echo state networks for learning chaotic dynamics
Reservoir computing with error correction: long-term behaviors of stochastic dynamical systems
Learning dynamical systems from data: a simple cross-validation perspective. IV: Case with partial observations
Accuracy and architecture studies of residual neural network method for ordinary differential equations
Approximation bounds for random neural networks and reservoir systems
Parsimony as the ultimate regularizer for physics-informed machine learning
Stability analysis of chaotic systems from data
Learning Theory for Dynamical Systems
Learning strange attractors with reservoir systems
Inferring symbolic dynamics of chaotic flows from persistence
Ensemble forecasts in reproducing kernel Hilbert space family
Learning dynamics by reservoir computing (In Memory of Prof. Pavol Brunovský)
Identification of chimera using machine learning
Invertible generalized synchronization: A putative mechanism for implicit learning in neural systems
Reducing network size and improving prediction stability of reservoir computing
Seeking optimal parameters for achieving a lightweight reservoir computing: a computational endeavor
Using reservoir computer to predict and prevent extreme events
Machine learning, alignment of covariant Lyapunov vectors, and predictability in Rikitake's geomagnetic dynamo model
Deep learning of dynamics and signal-noise decomposition with time-stepping constraints
Modeling the dynamics of PDE systems with physics-constrained deep auto-regressive networks
Assessing observability of chaotic systems using Delay Differential Analysis
Learning dynamical systems in noise using convolutional neural networks
A novel method based on the pseudo-orbits to calculate the largest Lyapunov exponent from chaotic equations
Embedding and approximation theorems for echo state networks
Measuring Lyapunov exponents of large chaotic systems with global coupling by time series analysis
Identifying the linear region based on machine learning to calculate the largest Lyapunov exponent from chaotic time series
Reservoir Computing with an Inertial Form
Unnamed Item
Learning the tangent space of dynamical instabilities from data
Artificial Intelligence, Chaos, Prediction and Understanding in Science
Using machine learning to predict extreme events in the Hénon map
Collective dynamics of rate neurons for supervised learning in a reservoir computing system
Good and bad predictions: Assessing and improving the replication of chaotic attractors by means of reservoir computing
Stability analysis of reservoir computers dynamics via Lyapunov functions
Predicting critical transitions in multiscale dynamical systems using reservoir computing
Breaking symmetries of the reservoir equations in echo state networks
Using machine learning to predict statistical properties of non-stationary dynamical processes: System climate, regime transitions, and the effect of stochasticity
Inferring the dynamics of oscillatory systems using recurrent neural networks
Transfer learning of chaotic systems
Multifunctionality in a reservoir computer
Error bounds of the invariant statistics in machine learning of ergodic Itô diffusions
Learning dynamical systems from data: a simple cross-validation perspective. I: Parametric kernel flows
Echo state networks trained by Tikhonov least squares are \(L^2(\mu)\) approximators of ergodic dynamical systems
Using data assimilation to train a hybrid forecast system that combines machine-learning and knowledge-based components
Detecting unstable periodic orbits based only on time series: When adaptive delayed feedback control meets reservoir computing
The reservoir's perspective on generalized synchronization
Unnamed Item
Chaos: From theory to applications for the 80th birthday of Otto E. Rössler
Synchronization of reservoir computing models via a nonlinear controller
Clustered and deep echo state networks for signal noise reduction
Generalized Cell Mapping Method with Deep Learning for Global Analysis and Response Prediction of Dynamical Systems
One-shot learning of stochastic differential equations with data adapted kernels
Chaotic diffusion of dissipative solitons: from anti-persistent random walks to hidden Markov models




Cites Work




This page was built for publication: Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data