A multidimensional Tauberian theorem for Laplace transforms of ultradistributions

From MaRDI portal
Publication:5220682

DOI: 10.1080/10652469.2019.1699556 · zbMATH Open: 1464.46048 · arXiv: 1902.00902 · OpenAlex: W2912194815 · Wikidata: Q126575098 (Scholia: Q126575098) · MaRDI QID: Q5220682 (FDO: Q5220682)


Authors: Lenny Neyt, J. Vindas


Publication date: 27 March 2020

Published in: Integral Transforms and Special Functions

Abstract: We obtain a multidimensional Tauberian theorem for Laplace transforms of Gelfand–Shilov ultradistributions. The result is derived from a Laplace transform characterization of bounded sets in spaces of ultradistributions with supports in a convex acute cone of $\mathbb{R}^n$, also established here.
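For background, the Laplace transform in this setting is the classical one for (ultra)distributions supported in a cone, in the tradition of Vladimirov's theory. The following is a standard formulation given as context, not the paper's exact definition:

```latex
% For f supported in a closed convex acute cone \Gamma \subset \mathbb{R}^n,
% the Laplace transform is the holomorphic function on the tube domain T^C
% over the interior C of the dual cone \Gamma^*:
\[
  \mathcal{L}\{f\}(z) \;=\; \bigl\langle f(\xi),\, e^{\,i z\cdot \xi} \bigr\rangle,
  \qquad z \in T^{C} = \mathbb{R}^n + i\,C,
  \qquad C = \operatorname{int}\Gamma^{*},
\]
\[
  \Gamma^{*} \;=\; \{\, y \in \mathbb{R}^n : y\cdot\xi \ge 0 \ \text{for all } \xi \in \Gamma \,\}.
\]
```

A Tauberian theorem in this context deduces the asymptotic behavior of $f$ from the boundary behavior of $\mathcal{L}\{f\}(z)$ as $z$ approaches the distinguished boundary of the tube $T^{C}$.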


Full work available at URL: https://arxiv.org/abs/1902.00902




Cited In (5)





