Removing numerical dispersion from linear evolution equations


DOI: 10.2140/PAA.2021.3.253 · zbMATH Open: 1481.65163 · arXiv: 1906.10743 · OpenAlex: W3200394801 · MaRDI QID: Q2665622 · FDO: Q2665622


Authors: Jens Wittsten, Erik F. M. Koene, Fredrik Andersson, Johan O. A. Robertsson


Publication date: 19 November 2021

Published in: Pure and Applied Analysis

Abstract: We describe a method for removing the numerical errors in the modeling of linear evolution equations that are caused by approximating the time derivative by a finite difference operator. The method is based on integral transforms realized as certain Fourier integral operators, called time dispersion transforms, and we prove that, under an assumption about the frequency content, it yields a solution with the correct evolution throughout its entire lifespan. We demonstrate the method on a model equation as well as on the simulation of elastic and viscoelastic wave propagation.


Full work available at URL: https://arxiv.org/abs/1906.10743
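
The abstract describes the time dispersion transforms only abstractly. The following is a minimal sketch of the underlying idea for the simplest possible setting: a model oscillator u'' = -omega0^2 u integrated with a second-order leapfrog scheme. It is not the authors' implementation; all function names, parameters, and the choice of model problem are illustrative assumptions. For this scheme, physics at a true frequency w appears in the computed trace at the warped frequency (2/dt)*arcsin(w*dt/2), so a correction can be written as a discrete Fourier transform with the warped phase, followed by an ordinary inverse transform.

```python
# Minimal sketch (assumed setup, not the paper's implementation): undo the
# time dispersion of a leapfrog scheme by a warped-phase Fourier transform.
import numpy as np


def leapfrog_oscillator(omega0, dt, nt):
    """Integrate u'' = -omega0**2 * u with a centered (leapfrog) scheme.

    All of the error in this example comes from replacing d^2/dt^2 by the
    second-order finite difference, i.e. it is pure time dispersion."""
    u = np.zeros(nt)
    u[0] = 1.0                                  # u(0) = 1, u'(0) = 0
    u[1] = 1.0 - 0.5 * (omega0 * dt) ** 2       # second-order start-up step
    for n in range(1, nt - 1):
        u[n + 1] = 2.0 * u[n] - u[n - 1] - (omega0 * dt) ** 2 * u[n]
    return u


def remove_time_dispersion(u, dt):
    """Map numerical frequencies back to physical ones (sketch of an inverse
    time dispersion transform for the leapfrog scheme).

    The usual DFT phase n*w*dt is replaced by 2*n*arcsin(w*dt/2); an ordinary
    inverse transform then returns the corrected trace."""
    nt = len(u)
    n = np.arange(nt)
    w = 2.0 * np.pi * np.fft.rfftfreq(nt, d=dt)      # physical angular freqs
    valid = w * dt / 2.0 < 1.0                       # band the scheme can represent
    phase = np.zeros_like(w)
    phase[valid] = 2.0 * np.arcsin(w[valid] * dt / 2.0)
    spectrum = np.exp(-1j * np.outer(phase, n)) @ u  # warped-phase DFT
    spectrum[~valid] = 0.0                           # drop unrepresentable band
    return np.fft.irfft(spectrum, n=nt)


if __name__ == "__main__":
    dt, nt, omega0 = 0.05, 400, 8.0                  # coarse step: visible dispersion
    t = dt * np.arange(nt)
    exact = np.cos(omega0 * t)
    raw = leapfrog_oscillator(omega0, dt, nt)
    fixed = remove_time_dispersion(raw, dt)
    print("max error before correction:", np.max(np.abs(raw - exact)))
    print("max error after correction: ", np.max(np.abs(fixed - exact)))
```

Because the example neither pads nor tapers the trace, some leakage and edge error remains after the correction; this is where the paper's assumption about the frequency content enters, under which the authors prove the correction is exact for general linear evolution equations.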




