Removing numerical dispersion from linear evolution equations

From MaRDI portal
Publication:2665622

Abstract: We describe a method for removing the numerical errors in the modeling of linear evolution equations that are caused by approximating the time derivative by a finite difference operator. The method is based on integral transforms realized as certain Fourier integral operators, called time dispersion transforms, and we prove that, under an assumption about the frequency content, it yields a solution with correct evolution throughout the entire lifespan. We demonstrate the method on a model equation as well as on the simulation of elastic and viscoelastic wave propagation.
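The core idea can be illustrated on the simplest possible case. A minimal sketch, not the paper's construction: for the model equation u'(t) = iω₀u(t) (exact solution e^{iω₀t}), a leap-frog time step propagates the slightly wrong frequency Ω = arcsin(ω₀Δt)/Δt, obtained from the numerical dispersion relation sin(ΩΔt) = ω₀Δt. For a monochromatic signal, the frequency remap Ω → ω₀ that a time dispersion transform performs reduces to a single phase rotation in time. The values of `omega0`, `dt`, and `N` below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Model equation u'(t) = i*omega0*u(t); exact solution exp(i*omega0*t).
# Leap-frog scheme: u[n+1] = u[n-1] + 2i*omega0*dt*u[n].
# Substituting u[n] = exp(i*Omega*n*dt) yields the numerical dispersion
# relation sin(Omega*dt) = omega0*dt, so the scheme oscillates at the
# shifted frequency Omega = arcsin(omega0*dt)/dt instead of omega0.
omega0 = 1.0   # illustrative choice
dt = 0.3       # deliberately coarse step so the dispersion error is visible
N = 256
t = dt * np.arange(N)

u = np.empty(N, dtype=complex)
u[0] = 1.0
u[1] = np.exp(1j * omega0 * dt)   # bootstrap the two-step scheme exactly
for n in range(1, N - 1):
    u[n + 1] = u[n - 1] + 2j * omega0 * dt * u[n]

exact = np.exp(1j * omega0 * t)
err_raw = np.max(np.abs(u - exact))   # grows as the phase error accumulates

# Monochromatic special case of a time dispersion transform:
# remapping the single frequency Omega back to omega0 is just a
# time-dependent phase rotation of the computed trace.
Omega = np.arcsin(omega0 * dt) / dt
u_corr = u * np.exp(-1j * (Omega - omega0) * t)
err_corr = np.max(np.abs(u_corr - exact))
print(err_raw, err_corr)
```

For broadband signals the same remap is applied frequency by frequency on the Fourier transform of the trace, which is where the Fourier-integral-operator formulation in the abstract comes in; the residual error above stems from the parasitic leap-frog mode excited by the bootstrap step, not from the correction itself.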