How entropic regression beats the outliers problem in nonlinear system identification

From MaRDI portal
Publication:5218142

DOI: 10.1063/1.5133386
zbMATH Open: 1433.93032
arXiv: 1905.08061
OpenAlex: W2997092692
Wikidata: Q89512435 (Scholia: Q89512435)
MaRDI QID: Q5218142
FDO: Q5218142


Authors: Abd AlRahman R. AlMomani, Jie Sun, Erik Bollt


Publication date: 28 February 2020

Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science

Abstract: In this work, we develop a nonlinear System Identification (SID) method that we call Entropic Regression. Our method adopts an information-theoretic measure for the data-driven discovery of the underlying dynamics. It is robust to noise and outliers and outperforms many current state-of-the-art methods. Moreover, Entropic Regression overcomes several major limitations of existing methods, such as sloppy parameters, diverse scales, and SID in high-dimensional systems such as complex networks. The use of information-theoretic measures in entropic regression offers a distinctive advantage: by the Asymptotic Equipartition Property (AEP) of probability distributions, outliers and other low-occurrence events are conveniently and intrinsically de-emphasized as atypical, by definition. We provide a numerical comparison with current state-of-the-art sparse-regression methods, and we apply the methods to several chaotic systems, including the Lorenz system, the Kuramoto-Sivashinsky equation, and the double-well potential.
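The abstract describes a greedy, information-guided selection of model terms from a candidate library. The paper's method scores candidate terms with conditional-mutual-information estimates; the sketch below illustrates only the greedy-selection skeleton of that idea, substituting a simple Gaussian residual-entropy proxy for the authors' estimator. All function names and the toy library here are invented for illustration, not taken from the paper's implementation.

```python
import numpy as np

def residual_entropy(y, X):
    """Gaussian-proxy entropy of the residual of y regressed on the
    columns of X: up to an additive constant, 0.5 * log(residual variance)."""
    if X.shape[1] == 0:
        r = y - y.mean()
    else:
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ coef
    return 0.5 * np.log(np.var(r) + 1e-12)

def entropic_forward_selection(y, library, tol=0.01):
    """Greedily add library terms while the information gain (drop in
    residual entropy) exceeds tol; return indices of the selected terms."""
    selected = []
    h = residual_entropy(y, library[:, selected])
    while len(selected) < library.shape[1]:
        gains = [(-np.inf if j in selected
                  else h - residual_entropy(y, library[:, selected + [j]]))
                 for j in range(library.shape[1])]
        j_best = int(np.argmax(gains))
        if gains[j_best] < tol:
            break
        selected.append(j_best)
        h -= gains[j_best]
    return selected

# Toy demo: recover y = 2*x1 - 3*x1*x2 from a 5-term candidate library.
rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal((2, 400))
y = 2.0 * x1 - 3.0 * x1 * x2 + 0.01 * rng.standard_normal(400)
library = np.column_stack([np.ones(400), x1, x2, x1 * x2, x1 ** 2])
selected = entropic_forward_selection(y, library)
print(sorted(selected))  # true terms: columns 1 (x1) and 3 (x1*x2)
```

The stopping tolerance plays the role the abstract ascribes to typicality: terms whose inclusion yields only a marginal information gain (e.g., terms fit to outliers or noise) fall below `tol` and are never selected.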


Full work available at URL: https://arxiv.org/abs/1905.08061




Cited In (15)


