Learning about structural errors in models of complex dynamical systems
From MaRDI portal
Publication: Q6572173
DOI: 10.1016/j.jcp.2024.113157 · MaRDI QID: Q6572173
Authors: Jin-Long Wu, Matthew E. Levine, Tapio Schneider, A. M. Stuart
Publication date: 15 July 2024
Published in: Journal of Computational Physics
Recommendations
- A framework for machine learning of model error in dynamical systems
- A causality-based learning approach for discovering the underlying dynamics of complex systems from partial observations with stochastic parameterization
- Learning and correcting non-Gaussian model errors
- Learning stochastic closures using ensemble Kalman inversion
- Deep learning of dynamics and signal-noise decomposition with time-stepping constraints
Mathematics Subject Classification
- Artificial intelligence (68Txx)
- Probabilistic methods, stochastic differential equations (65Cxx)
- Inference from stochastic processes (62Mxx)
Cites Work
- Algorithm 808
- Bayesian calibration of computer models. (With discussion)
- Gaussian processes for machine learning.
- Langevin diffusions and Metropolis-Hastings algorithms
- Discovering governing equations from data by sparse identification of nonlinear dynamical systems
- Efficient estimation of stochastic volatility using noisy observations: a multi-scale approach
- Sparse dynamics for partial differential equations
- A Tale of Two Time Scales
- Compressed sensing
- Combining Field Data and Computer Simulations for Calibration and Prediction
- Embedology
- Analysis and approximation of nonlocal diffusion problems with volume constraints
- A computational strategy for multiscale systems with applications to Lorenz 96 model
- Nonlocal diffusion and applications
- Nonequilibrium statistical mechanics
- Nonparametric estimation of diffusions: a differential equations approach
- Parameter estimation for partially observed hypoelliptic diffusions
- On inference for partially observed nonlinear diffusion models using the Metropolis-Hastings algorithm
- Maximum likelihood drift estimation for multiscale diffusions
- Learning about physical parameters: the importance of model discrepancy
- When is a model good enough? Deriving the expected value of model improvement via specifying internal model discrepancies
- Ensemble Kalman methods for inverse problems
- Parameter estimation for multiscale diffusions
- Ensemble Kalman methods with constraints
- Estimation of parameters and eigenmodes of multivariate autoregressive models
- Semiparametric drift and diffusion estimation for multiscale diffusions
- Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: a data-driven, physics-informed Bayesian approach
- Turbulence modeling in the age of data
- Finite sampling interval effects in Kramers-Moyal analysis
- Iterative updating of model error for Bayesian inversion
- Numerical strategy for model correction using physical constraints
- A new framework for extracting coarse-grained models from time series with multiscale structure
- A novel evolutionary algorithm applied to algebraic modifications of the RANS stress-strain relationship
- Reynolds averaged turbulence modelling using deep neural networks with embedded invariance
- Recurrent neural network closure of parametric POD-Galerkin reduced-order models based on the Mori-Zwanzig formalism
- RANS turbulence model development using CFD-driven machine learning
- Learning nonlocal constitutive models with neural networks
- Machine learning for fluid mechanics
- Statistical Analysis with Missing Data, Third Edition
- Nonlocal Modeling, Analysis, and Computation
- A framework for machine learning of model error in dynamical systems
- Large-scale sparse inverse covariance matrix estimation
- Subgrid modelling for two-dimensional turbulence using neural networks
- Learning partial differential equations via data discovery and sparse optimization
- Data-driven model reduction, Wiener projections, and the Koopman-Mori-Zwanzig formalism
- Ensemble Kalman inversion for sparse learning of dynamical systems from time-averaged data
- nPINNs: nonlocal physics-informed neural networks for a parametrized nonlocal universal Laplacian operator. Algorithms and applications
- Calibrate, emulate, sample
- Learning dynamical systems from data: a simple cross-validation perspective. I: Parametric kernel flows
- Learning dynamical systems from data: a simple cross-validation perspective. III: Irregularly-sampled time series
- Solving and learning nonlinear PDEs with Gaussian processes
- The Random Feature Model for Input-Output Maps between Banach Spaces
- Learning stochastic closures using ensemble Kalman inversion
- Ensemble inference methods for models with noisy and expensive likelihoods
- Localized ensemble Kalman inversion
- Autodifferentiable ensemble Kalman filters
- Inverse Problems and Data Assimilation
- Ensemble Kalman method for learning turbulence models from indirect observation data
- Embedded model error representation for Bayesian model calibration
- Parameter estimation for multiscale diffusions: an overview
- Error modeling for surrogates of dynamical systems using machine learning
- Model selection of chaotic systems from data with hidden variables using sparse data assimilation
Cited In (1)