Convergence Rates for Learning Linear Operators from Noisy Data
DOI: 10.1137/21m1442942 · zbMath: 1514.62075 · arXiv: 2108.12515 · OpenAlex: W3196963853 · MaRDI QID: Q6109175
Andrew M. Stuart, Maarten V. de Hoop, Nicholas H. Nelsen, Nikola B. Kovachki
Publication date: 30 June 2023
Published in: SIAM/ASA Journal on Uncertainty Quantification
Full work available at URL: https://arxiv.org/abs/2108.12515
Keywords: Bayesian inference · statistical learning theory · linear inverse problems · posterior consistency · operator learning · distribution shift
MSC classification: Asymptotic properties of nonparametric inference (62G20) · Bayesian problems; characterization of Bayes procedures (62C10) · Learning and adaptive systems in artificial intelligence (68T05) · Equations involving linear operators, with operator unknowns (47A62)
Cites Work
- Discovering governing equations from data by sparse identification of nonlinear dynamical systems
- An Analysis of the Total Least Squares Problem
- Bayesian inverse problems with non-conjugate priors
- Bayesian inverse problems with Gaussian priors
- Efficient nonparametric Bayesian inference for \(X\)-ray transforms
- On statistical Calderón problems
- Prediction in functional linear regression
- Functional regression with repeated eigenvalues
- SubGaussian random variables in Hilbert spaces
- A general approach to posterior contraction in nonparametric inverse problems
- Bayesian posterior contraction rates for linear severely ill-posed inverse problems
- A physics-informed operator regression framework for extracting data-driven continuum models
- Model reduction and neural networks for parametric PDEs
- Derivative-informed projected neural networks for high-dimensional parametric maps governed by PDEs
- Deep learning architectures for nonlinear operator functions and nonlinear inverse problems
- Data-driven approximation of the Koopman generator: model reduction, system identification, and control
- Designing truncated priors for direct and inverse Bayesian problems
- Convergence analysis of Tikhonov regularization for non-linear statistical inverse problems
- Solving electrical impedance tomography with deep learning
- Bayesian linear inverse problems in regularity scales
- Stability of the non-abelian \(X\)-ray transform in dimension \(\ge 3\)
- Eigendecompositions of transfer operators in reproducing kernel Hilbert spaces
- Data-driven spectral decomposition and forecasting of ergodic dynamical systems
- Convergence types and rates in generic Karhunen-Loève expansions with applications to sample path properties
- Optimal rates for the regularized least-squares algorithm
- Asymptotics of prediction in functional linear regression with functional outputs
- Posterior contraction rates for the Bayesian approach to linear ill-posed inverse problems
- Functional data analysis.
- Learning elliptic partial differential equations with randomized linear algebra
- Inverse problems: A Bayesian perspective
- Analysis of the Gibbs Sampler for Hierarchical Inverse Problems
- Stochastic Differential Equations in Infinite Dimensions
- The Random Feature Model for Input-Output Maps between Banach Spaces
- Nonparametric statistical inverse problems
- Posterior Contraction in Bayesian Inverse Problems Under Gaussian Priors
- Bayesian inverse problems with unknown operators
- Deep learning in high dimension: Neural network expression rates for generalized polynomial chaos expansions in UQ
- High-Dimensional Statistics
- High-Dimensional Probability
- Linear and Nonlinear Inverse Problems with Practical Applications
- Linear inverse problems for generalised random variables
- Linear estimators and measurable linear transformations on a Hilbert space
- A double regularization approach for inverse problems with noisy data and inexact operator
- Bernstein--von Mises Theorems and Uncertainty Quantification for Linear Inverse Problems
- Convergence Rates for Penalized Least Squares Estimators in PDE Constrained Regression Problems
- Consistent Inversion of Noisy Non-Abelian X-Ray Transforms
- Computing Spectral Measures of Self-Adjoint Operators
- Excess risk bounds in robust empirical risk minimization
- A Rigorous Theory of Conditional Mean Embeddings
- Two-Layer Neural Networks with Values in a Banach Space
- Error estimates for DeepONets: a deep learning framework in infinite dimensions
- Data driven regularization by projection
- A Note on Estimation in Hilbertian Linear Models
- Solving inverse problems using data-driven models
- MAP estimators and their consistency in Bayesian nonparametric inverse problems
- Deep Neural Networks for Inverse Problems with Pseudodifferential Operators: An Application to Limited-Angle Tomography
- Bayesian Inference of an Uncertain Generalized Diffusion Operator
- Bridging and Improving Theoretical and Computational Electrical Impedance Tomography via Data Completion