How entropic regression beats the outliers problem in nonlinear system identification

From MaRDI portal
Publication:5218142




Abstract: In this work, we develop a nonlinear System Identification (SID) method that we call Entropic Regression. The method adopts an information-theoretic measure for the data-driven discovery of the underlying dynamics. It is robust to noise and outliers and outperforms many current state-of-the-art methods. Moreover, Entropic Regression overcomes several major limitations of existing methods, such as sloppy parameters, diverse scales, and SID in high-dimensional systems such as complex networks. The use of information-theoretic measures offers a distinctive advantage: by the Asymptotic Equipartition Property (AEP) of probability distributions, outliers and other low-occurrence events are intrinsically de-emphasized as atypical, by definition. We provide a numerical comparison with current state-of-the-art sparse regression methods, and we apply the methods to several chaotic systems, including the Lorenz system, the Kuramoto-Sivashinsky equation, and the double-well potential.
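The core idea described in the abstract — selecting basis functions for the dynamics by how much information they carry about the observed derivatives, rather than by least-squares fit alone — can be sketched as a greedy forward-selection loop. The sketch below is an illustration, not the authors' implementation: the paper's method uses conditional mutual information estimators, while here a simple Gaussian-assumption mutual information estimate (`-0.5*log(1 - rho**2)`) stands in, and the function names, the library matrix `Phi`, and the tolerance `tol` are all assumptions for the example.

```python
import numpy as np

def gaussian_mi(x, y):
    """Mutual information estimate under a joint-Gaussian assumption:
    MI = -0.5 * log(1 - rho^2), with rho the Pearson correlation.
    (A stand-in for the information-theoretic measure in the paper.)"""
    rho = np.corrcoef(x, y)[0, 1]
    rho = np.clip(rho, -0.999999, 0.999999)
    return -0.5 * np.log(1.0 - rho**2)

def entropic_regression_sketch(Phi, dx, tol=0.05):
    """Greedy sparse SID sketch: repeatedly add the candidate basis
    column of Phi that carries the most information about the current
    residual of dx; stop when the information gain falls below tol."""
    n, k = Phi.shape
    selected = []
    residual = dx.copy()
    for _ in range(k):
        gains = [gaussian_mi(Phi[:, j], residual) if j not in selected
                 else -np.inf for j in range(k)]
        j_best = int(np.argmax(gains))
        if gains[j_best] < tol:
            break
        selected.append(j_best)
        # Refit on the selected terms and update the residual.
        coef, *_ = np.linalg.lstsq(Phi[:, selected], dx, rcond=None)
        residual = dx - Phi[:, selected] @ coef
    coef_full = np.zeros(k)
    if selected:
        coef, *_ = np.linalg.lstsq(Phi[:, selected], dx, rcond=None)
        coef_full[np.array(selected)] = coef
    return coef_full, selected

# Synthetic demo: dx depends on only 2 of 5 candidate basis functions.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((500, 5))
dx = 2.0 * Phi[:, 1] - 3.0 * Phi[:, 3] + 0.1 * rng.standard_normal(500)
coef, selected = entropic_regression_sketch(Phi, dx)
```

Because selection is driven by an information criterion with a stopping tolerance rather than by minimizing squared error over all terms, spurious low-information candidates are never admitted, which mirrors the robustness-to-outliers argument in the abstract.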



