An efficient methodology for the analysis and modeling of computer experiments with large number of inputs

From MaRDI portal
Publication:6285871

arXiv: 1704.07090 · MaRDI QID: Q6285871


Authors: B. Iooss, Amandine Marrel


Publication date: 24 April 2017

Abstract: Complex computer codes are often too time-expensive to be used directly for uncertainty, sensitivity, optimization and robustness analyses. A widely accepted method to circumvent this problem consists in replacing CPU-time-expensive computer models with inexpensive mathematical functions, called metamodels. For example, the Gaussian process (Gp) model has shown strong capabilities for solving practical problems, which often involve several interlinked issues. However, in the case of high-dimensional experiments (typically with several tens of inputs), the Gp metamodel building process remains difficult, or even unfeasible, and the application of variable selection techniques cannot be avoided. In this paper, we present a general methodology for building a Gp metamodel with a large number of inputs in a very efficient manner. While our work focuses on the Gp metamodel, its principles are fully generic and can be applied to any type of metamodel. The objective is twofold: estimating, from a minimal number of computer experiments, a highly predictive metamodel. This methodology is successfully applied to an industrial computer code.
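The generic two-step idea described in the abstract (screen the inputs to keep only the influential ones, then build a Gp metamodel on that reduced input set) can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the screening criterion (mutual information), the toy simulator, the selection threshold, and all parameter values below are assumptions chosen for the example.

```python
# Hedged sketch of the generic methodology: (1) variable screening,
# (2) Gaussian-process metamodel built on the selected inputs only.
# The simulator, screening rule and thresholds are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Toy "expensive code": 20 inputs, only the first three actually matter.
d = 20
def simulator(X):
    return np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2]

X_train = rng.uniform(size=(200, d))
y_train = simulator(X_train)

# Step 1: screening. Rank inputs by estimated mutual information with the
# output and keep those above a (hypothetical) relative threshold.
mi = mutual_info_regression(X_train, y_train, random_state=0)
selected = np.flatnonzero(mi > 0.1 * mi.max())

# Step 2: fit the Gp metamodel on the selected inputs only, which keeps the
# hyperparameter optimization tractable even when d is large.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_train[:, selected], y_train)

# Assess predictivity with the Q2 coefficient on an independent test sample.
X_test = rng.uniform(size=(500, d))
y_test = simulator(X_test)
y_pred = gp.predict(X_test[:, selected])
q2 = 1 - np.sum((y_test - y_pred) ** 2) / np.sum((y_test - y_test.mean()) ** 2)
print(f"selected inputs: {selected.tolist()}, Q2 = {q2:.3f}")
```

On this toy problem the screening step discards most of the 20 inputs, so the Gp is fit in a much lower-dimensional space while retaining high predictivity (Q2 close to 1).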













