Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints
Publication: 2142219
DOI: 10.1016/j.cma.2022.115078 | OpenAlex: W4280528135 | MaRDI QID: Q2142219
Publication date: 27 May 2022
Published in: Computer Methods in Applied Mechanics and Engineering
Full work available at URL: https://arxiv.org/abs/2202.05112
Keywords: Kullback-Leibler divergence; stochastic homogenization; probabilistic learning; uncertainty quantification; statistical inverse problem; implicit constraints
MSC classifications: Computational methods for problems pertaining to statistics (62-08); Probabilistic models, generic numerical methods in probability and statistics (65C20)
Related Items (4)
Uses Software
Cites Work
- On Information and Sufficiency
- Probabilistic learning on manifolds (PLoM) with partition
- Basis adaptation in homogeneous chaos spaces
- Time-domain formulation in computational dynamics for linear viscoelastic media with model uncertainties and stochastic excitation
- On the statistical dependence for the components of random elasticity tensors exhibiting material symmetry properties
- Kullback-Leibler upper confidence bounds for optimal sequential allocation
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Uncertainty quantification. An accelerated course with advanced applications in computational engineering
- Adaptive sparse polynomial chaos expansion based on least angle regression
- Periodization of random media and representative volume element size for linear composites
- A computational inverse method for identification of non-Gaussian random fields using the Bayesian approach in very high dimension
- A robust and efficient stepwise regression method for building sparse polynomial chaos expansions
- Approximate Bayesian computational methods
- Bayesian inference with optimal maps
- Data-driven probability concentration and sampling on manifold
- Stochastic spectral methods for efficient Bayesian solution of inverse problems
- Statistical volume element method for predicting microstructure-constitutive property relations
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Identification of chaos representations of elastic properties of random media using experimental vibration tests
- Generalizing the finite element method: Diffuse approximation and diffuse elements
- Higher-order implicit strong numerical schemes for stochastic differential equations
- An algorithm for finding the distribution of maximal entropy
- Efficient global optimization of expensive black-box functions
- Random field models of heterogeneous materials
- Meshless methods: An overview and recent developments
- A micromechanics-based nonlocal constitutive equation and estimates of representative volume element size for elastic composites.
- Meshless methods based on collocation with radial basis functions
- Reduced Wiener chaos representation of random fields via basis adaptation and projection
- Machine learning of linear differential equations using Gaussian processes
- Itô-SDE MCMC method for Bayesian characterization of errors associated with data limitations in stochastic expansion methods for uncertainty quantification
- Hidden physics models: machine learning of nonlinear partial differential equations
- Projection-based model reduction: formulations for physics-based machine learning
- Statistical and computational inverse problems.
- DGM: a deep learning algorithm for solving partial differential equations
- Data-driven surrogates for high dimensional models using Gaussian process regression on the Grassmann manifold
- Hierarchical deep learning neural network (HiDeNN): an artificial intelligence (AI) framework for computational science and engineering
- Adaptive method for indirect identification of the statistical properties of random fields in a Bayesian framework
- A Bayesian perspective of statistical machine learning for big data
- Sampling of Bayesian posteriors with a non-Gaussian probabilistic learning on manifolds from a small dataset
- Probabilistic learning on manifolds constrained by nonlinear partial differential equations for small datasets
- A neural network-based surrogate model for carbon nanotubes with geometric nonlinearities
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Clustering discretization methods for generation of material performance databases in machine learning and design optimization
- A comparison of variational approximations for fast inference in mixed logit models
- Variational Markov chain Monte Carlo for Bayesian smoothing of non-linear diffusions
- Computational stochastic homogenization of heterogeneous media from an elasticity random field having an uncertain spectral measure
- Bayesian Calibration of Computer Models
- On the Brittleness of Bayesian Inference
- Stochastic Model and Generator for Random Fields with Symmetry Properties: Application to the Mesoscopic Modeling of Elastic Random Media
- Inverse problems: A Bayesian perspective
- Handbook of Uncertainty Quantification
- Statistical learning and selective inference
- Computational Aspects for Constructing Realizations of Polynomial Chaos in High Dimension
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- DESIGN UNDER UNCERTAINTY EMPLOYING STOCHASTIC EXPANSION METHODS
- Multi-Element Generalized Polynomial Chaos for Arbitrary Probability Measures
- Maximum likelihood estimation of stochastic chaos representations from experimental data
- Polynomial Chaos Expansion of a Multimodal Random Vector
- Construction of probability distributions in high dimension using the maximum entropy principle: Applications to stochastic processes, random fields and random matrices
- Homogenization and Two-Scale Convergence
- A General Convergence Result for a Functional Related to the Theory of Homogenization
- Solving Nonlinear Equations with Newton's Method
- Geometric numerical integration illustrated by the Störmer–Verlet method
- A pure Hubbard model with demonstrable pairing adjacent to the Mott-insulating phase
- Surface remeshing by local Hermite diffuse interpolation
- Probability
- Bayesian adaptation of chaos representations using variational inference and sampling on geodesics
- Computational Statistics
- Physical Systems with Random Uncertainties: Chaos Representations with Arbitrary Probability Measure
- The Wiener–Askey Polynomial Chaos for Stochastic Differential Equations
- Identification of Polynomial Chaos Representations in High Dimension from a Set of Realizations
- An Introduction to Statistical Learning
- Random field representations for stochastic elliptic boundary value problems and statistical inverse problems
- Sparse Polynomial Chaos Expansions: Literature Survey and Benchmark
- Stochastic elliptic operators defined by non-Gaussian random fields with uncertain spectrum
- Compressed Principal Component Analysis of Non-Gaussian Vectors
- A modern maximum-likelihood theory for high-dimensional logistic regression
- Bayesian Numerical Homogenization
- Goal-Oriented Optimal Approximations of Bayesian Linear Inverse Problems
- Calculation of Lagrange Multipliers in the Construction of Maximum Entropy Distributions in High Stochastic Dimension
- Numerical Methods for Second‐Order Stochastic Differential Equations
- Elements of Information Theory
- Computer Vision - ECCV 2004
- Expansion of the global error for numerical schemes solving stochastic differential equations