Accelerated scale bridging with sparsely approximated Gaussian learning

From MaRDI portal
Publication:2222967

DOI: 10.1016/J.JCP.2019.109049
zbMATH Open: 1453.62650
arXiv: 1901.06777
OpenAlex: W2982104475
Wikidata: Q126991956 (Scholia: Q126991956)
MaRDI QID: Q2222967
FDO: Q2222967


Authors: Ting Wang, K. Leiter, Petr Plecháč, Jaroslaw Knap


Publication date: 28 January 2021

Published in: Journal of Computational Physics

Abstract: Multiscale modeling is a systematic approach to describing the behavior of complex systems by coupling models from different scales. The approach has proved very effective in areas of science as diverse as materials science, climate modeling, and chemistry. However, routine use of multiscale simulations is often hindered by the very high cost of the individual at-scale models. Approaches that aim to alleviate this cost by means of surrogate models based on Gaussian process regression have been proposed. Yet many of these surrogate models are themselves expensive to construct, especially when the amount of training data required is large. In this article, we employ a hierarchical sparse Cholesky decomposition to develop a sparse Gaussian process regression method and apply it to approximate the equation of state of an energetic material in a multiscale model of dynamic deformation. We demonstrate that the method provides a substantial reduction in both computational cost and solution error compared with previous methods.
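For context, the kind of Gaussian process regression the abstract refers to hinges on factorizing the kernel matrix. The sketch below is not the paper's hierarchical sparse Cholesky method; it shows the standard dense O(n³) Cholesky-based GP regression that such sparse approximations are designed to accelerate. The kernel choice, function names, and toy data are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.2, variance=1.0):
    # Squared-exponential kernel; an assumed, commonly used choice.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_fit_predict(X, y, Xs, noise=1e-6):
    # Standard GP regression via a dense Cholesky factorization of the
    # kernel matrix (the O(n^3) step). The paper replaces this step with
    # a hierarchical sparse approximation; this is only the baseline.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf_kernel(Xs, X)
    mean = Ks @ alpha                             # posterior mean at test points
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(rbf_kernel(Xs, Xs) - v.T @ v)   # posterior variance
    return mean, var

# Toy surrogate: learn a smooth scalar response from 20 samples.
X = np.linspace(0.0, 1.0, 20)[:, None]
y = np.sin(2.0 * np.pi * X).ravel()
Xs = np.array([[0.25], [0.5]])
mean, var = gp_fit_predict(X, y, Xs)
```

In the paper's setting, it is the replacement of this dense factorization with a hierarchical sparse Cholesky factor that makes the surrogate cheap enough to embed in a multiscale simulation of dynamic deformation.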


Full work available at URL: https://arxiv.org/abs/1901.06777




Cited In (5)


This page was built for publication: Accelerated scale bridging with sparsely approximated Gaussian learning
