Sparse polynomial chaos expansions using variational relevance vector machines

From MaRDI portal
Publication:781971

DOI: 10.1016/J.JCP.2020.109498
zbMATH Open: 1437.62114
arXiv: 1912.11029
OpenAlex: W3021864506
MaRDI QID: Q781971
FDO: Q781971


Authors: Panagiotis Tsilifis, Iason Papaioannou, Daniel Straub, F. Nobile


Publication date: 21 July 2020

Published in: Journal of Computational Physics

Abstract: The challenges for non-intrusive Polynomial Chaos methods lie in achieving computational efficiency and accuracy under a limited number of model simulations. These challenges can be addressed by enforcing sparsity in the series representation, retaining only the most important basis terms. In this work, we present a novel sparse Bayesian learning technique for obtaining sparse Polynomial Chaos expansions that is based on a Relevance Vector Machine model and is trained using Variational Inference. The methodology shows great potential in high-dimensional, data-driven settings using relatively few data points, and achieves user-controlled sparsity levels comparable to those of other methods such as compressive sensing. The proposed approach is illustrated on two numerical examples: a synthetic response function used for validation purposes, and a low-carbon steel plate with random Young's modulus and random loading, modeled with stochastic finite elements and 38 input random variables.
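The sparsity mechanism described in the abstract can be sketched in a few lines. The paper trains the Relevance Vector Machine with mean-field Variational Inference; for brevity, the sketch below uses Tipping's classic evidence-maximization updates instead, which exhibit the same automatic-relevance-determination behavior (per-coefficient precisions grow without bound for unneeded basis terms, pruning them). The one-dimensional response function, sample size, and thresholds are illustrative assumptions, not the paper's numerical examples.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)

n, p = 60, 10                        # sample size, maximum polynomial degree
x = rng.standard_normal(n)           # standard normal input (the "germ")
# Toy response: smooth function of x plus small observation noise
# (an illustrative stand-in for expensive model simulations).
y = np.sin(x) + 0.3 * x**2 + 0.01 * rng.standard_normal(n)

# Probabilists' Hermite polynomial chaos basis He_0..He_p, normalized to
# unit variance under N(0,1): Var[He_k(X)] = k!.
norms = np.sqrt([factorial(k) for k in range(p + 1)])
Phi = hermevander(x, p) / norms

alpha = np.ones(p + 1)               # per-coefficient precisions (ARD prior)
beta = 100.0                         # noise precision
for _ in range(200):
    # Gaussian posterior over the PC coefficients given hyperparameters.
    Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)
    mu = beta * Sigma @ Phi.T @ y
    # Hyperparameter re-estimation: for irrelevant terms, alpha diverges
    # and the corresponding coefficient is driven to zero (sparsity).
    gamma = 1.0 - alpha * np.diag(Sigma)
    alpha = np.clip(gamma / (mu**2 + 1e-12), 1e-12, 1e12)
    beta = (n - gamma.sum()) / (np.sum((y - Phi @ mu) ** 2) + 1e-12)

kept = alpha < 1e6                   # surviving ("relevant") basis terms
print("retained degrees:", np.flatnonzero(kept))
```

The pruning threshold on `alpha` is the user-controlled knob alluded to in the abstract: raising or lowering it trades expansion size against approximation accuracy, analogously to the tolerance in compressive-sensing approaches.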


Full work available at URL: https://arxiv.org/abs/1912.11029




Cited In (21)


This page was built for publication: Sparse polynomial chaos expansions using variational relevance vector machines
