Pages that link to "Item:Q2929996"
The following pages link to "Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces" (Q2929996):
Displaying 50 items.
- Gradient free active subspace construction using Morris screening elementary effects (Q521467)
- Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: a data-driven, physics-informed Bayesian approach (Q525932)
- Inverse regression-based uncertainty quantification algorithms for high-dimensional models: theory and practice (Q726930)
- Enhancing sparsity of Hermite polynomial expansions by iterative rotations (Q729367)
- Learning constitutive relations from indirect observations using deep neural networks (Q781968)
- Parametric domain decomposition for accurate reduced order models: applications of MP-LROM methodology (Q1636812)
- Kriging surrogate model with coordinate transformation based on likelihood and gradient (Q1675582)
- Basis adaptation and domain decomposition for steady-state partial differential equations with random coefficients (Q1684992)
- Stability analysis of thermo-acoustic nonlinear eigenproblems in annular combustors. Part II. Uncertainty quantification (Q1685120)
- Goal-oriented adaptive surrogate construction for stochastic inversion (Q1986228)
- A data-driven framework for sparsity-enhanced surrogates with arbitrary mutually dependent randomness (Q1987969)
- Systems of Gaussian process models for directed chains of solvers (Q1988026)
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification (Q2002273)
- A generalized active subspace for dimension reduction in mixed aleatory-epistemic uncertainty quantification (Q2020260)
- Embedded ridge approximations (Q2020990)
- Derivative-informed projected neural networks for high-dimensional parametric maps governed by PDEs (Q2060092)
- Active-subspace analysis of exceedance probability for shallow-water waves (Q2060951)
- Sparse mixture models inspired by ANOVA decompositions (Q2071472)
- A multi-fidelity polynomial chaos-greedy Kaczmarz approach for resource-efficient uncertainty quantification on limited budget (Q2072434)
- Surrogate assisted active subspace and active subspace assisted surrogate -- a new paradigm for high dimensional structural reliability analysis (Q2072474)
- Generalization bounds for sparse random feature expansions (Q2105118)
- Robust and resource-efficient identification of two hidden layer neural networks (Q2117339)
- Forward and backward uncertainty quantification with active subspaces: application to hypersonic flows around a cylinder (Q2122694)
- Clustered active-subspace based local Gaussian process emulator for high-dimensional and complex computer models (Q2134712)
- Artificial neural network based response surface for data-driven dimensional analysis (Q2137947)
- Generalized bounds for active subspaces (Q2180052)
- Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square (Q2180429)
- An efficient dimension reduction for the Gaussian process emulation of two nested codes with functional outputs (Q2203403)
- A distributed active subspace method for scalable surrogate modeling of function valued outputs (Q2211746)
- Compressive sensing adaptation for polynomial chaos expansions (Q2214527)
- Kriging-sparse polynomial dimensional decomposition surrogate model with adaptive refinement (Q2214537)
- PLS-based adaptation for efficient PCE representation in high dimensions (Q2220565)
- Adaptive sparse polynomial dimensional decomposition for derivative-based sensitivity (Q2221403)
- Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov chain Monte Carlo (Q2221416)
- Manifold learning for parameter reduction (Q2221444)
- A physics-aware, probabilistic machine learning framework for coarse-graining high-dimensional systems in the small data regime (Q2222510)
- Kriging-enhanced ensemble variational data assimilation for scalar-source identification in turbulent environments (Q2222548)
- Simulator-free solution of high-dimensional stochastic elliptic partial differential equations using deep neural networks (Q2223019)
- Accelerated basis adaptation in homogeneous chaos spaces (Q2246311)
- Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian processes (Q2246340)
- Latent map Gaussian processes for mixed variable metamodeling (Q2246360)
- On the choice of the low-dimensional domain for global optimization via random embeddings (Q2301180)
- Divide and conquer: an incremental sparsity promoting compressive sampling approach for polynomial chaos expansions (Q2309799)
- A near-stationary subspace for ridge approximation (Q2310068)
- Calibration of a SEIR-SEI epidemic model to describe the Zika virus outbreak in Brazil (Q2335754)
- Exploiting active subspaces to quantify uncertainty in the numerical simulation of the HyShot II scramjet (Q2374810)
- Dimension-independent likelihood-informed MCMC (Q2374891)
- Efficient parameter estimation for a methane hydrate model with active subspaces (Q2418684)
- A training set subsampling strategy for the reduced basis method (Q2666028)
- Computational graph completion (Q2671739)