The Random Feature Model for Input-Output Maps between Banach Spaces
Publication: 3382802
Recommendations
- Regularized learning schemes in feature Banach spaces
- Learning Theory
- Reproducing kernel Banach spaces for machine learning
- Banach space representer theorems for neural networks and ridge splines
- scientific article; zbMATH DE number 1283993
- scientific article; zbMATH DE number 1794211
- Learning with reproducing kernel Banach spaces
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Quasi-Banach Spaces of Random Variables and Modeling of Stochastic Processes
- Random Banach spaces: The limitations of the method
Cites work
- scientific article; zbMATH DE number 1260326
- scientific article; zbMATH DE number 5681750
- scientific article; zbMATH DE number 5055767
- A Data-Driven Stochastic Method for Elliptic PDEs with Random Coefficients
- A least-squares approximation of partial differential equations with high-dimensional random inputs
- A mean-field optimal control formulation of deep learning
- A physics-informed operator regression framework for extracting data-driven continuum models
- A proposal on machine learning via dynamical systems
- Adaptive finite element methods for elliptic equations with non-smooth coefficients
- Algorithms for Numerical Analysis in High Dimensions
- An 'empirical interpolation' method: application to efficient reduced-basis discretization of partial differential equations
- Approximation of high-dimensional parametric PDEs
- Bayesian deep convolutional encoder-decoder networks for surrogate modeling and uncertainty quantification
- Bayesian learning for neural networks
- Blow up and regularity for fractal Burgers equation
- ConvPDE-UQ: convolutional neural networks with quantified uncertainty for heterogeneous elliptic partial differential equations on varied domains
- DGM: a deep learning algorithm for solving partial differential equations
- Data driven approximation of parametrized PDEs by reduced basis and neural networks
- Data-driven deep learning of partial differential equations in modal space
- Data-driven forward discretizations for Bayesian inversion
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Deep learning in high dimension: neural network expression rates for generalized polynomial chaos expansions in UQ
- Deep neural networks motivated by partial differential equations
- Elliptic partial differential equations of second order
- Fourth-Order Time-Stepping for Stiff PDEs
- Functional multi-layer perceptron: A nonlinear tool for functional data analysis
- Gaussian processes for machine learning
- Hierarchical Bayesian level set inversion
- Kernel-based reconstructions for parametric PDEs
- Learning data-driven discretizations for partial differential equations
- MCMC methods for functions: modifying old algorithms to make them faster
- Machine learning from a continuous viewpoint. I
- Meta-learning pseudo-differential operators with deep neural networks
- Model reduction and approximation. Theory and algorithms
- Model reduction and neural networks for parametric PDEs
- Model reduction of dynamical systems on nonlinear manifolds using deep convolutional autoencoders
- Non-intrusive reduced order modeling of nonlinear problems using neural networks
- Numerical solution of the parametric diffusion equation by deep neural networks
- On Learning Vector-Valued Functions
- On the equivalence between kernel quadrature rules and random feature expansions
- On the mathematical foundations of learning
- Operator-valued kernels for learning from functional response data
- Optimal rates for the regularized least-squares algorithm
- Optimal weighted least-squares methods
- Optimization with PDE Constraints
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Reconciling modern machine-learning practice and the classical bias-variance trade-off
- Reproducing kernel Hilbert spaces for parametric partial differential equations
- Scattered Data Approximation
- Solving electrical impedance tomography with deep learning
- Sparse adaptive Taylor approximation algorithms for parametric and stochastic elliptic PDEs
- Spatial variation (2nd ed.)
- Stable architectures for deep neural networks
- Survey of multifidelity methods in uncertainty propagation, inference, and optimization
- The deep Ritz method: a deep learning-based numerical algorithm for solving variational problems
- Theory of Reproducing Kernels
- Universal approximation bounds for superpositions of a sigmoidal function
- Vector valued reproducing kernel Hilbert spaces of integrable functions and Mercer theorem
- Variational training of neural network approximations of solution maps for physical models
Cited in (28)
- Two-Layer Neural Networks with Values in a Banach Space
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
- Koopman neural operator as a mesh-free solver of non-linear partial differential equations
- Learning about structural errors in models of complex dynamical systems
- Local approximation of operators
- Learning homogenization for elliptic operators
- Operator learning using random features: a tool for scientific computing
- Sparse Recovery of Elliptic Solvers from Matrix-Vector Products
- Reduced operator inference for nonlinear partial differential equations
- SPADE4: sparsity and delay embedding based forecasting of epidemics
- A framework for machine learning of model error in dynamical systems
- Iterated Kalman methodology for inverse problems
- Energy-dissipative evolutionary deep operator neural networks
- Learning phase field mean curvature flows with neural networks
- Fast macroscopic forcing method
- Optimal Dirichlet boundary control by Fourier neural operators applied to nonlinear optics
- Derivative-informed neural operator: an efficient framework for high-dimensional parametric derivative learning
- RandONets: shallow networks with random projections for learning linear and nonlinear operators
- The Random Feature Model for Input-Output Maps between Banach Spaces
- MIONet: Learning Multiple-Input Operators via Tensor Product
- Learning high-dimensional parametric maps via reduced basis adaptive residual networks
- Data-driven forward and inverse problems for chaotic and hyperchaotic dynamic systems based on two machine learning architectures
- Variational regularization in inverse problems and machine learning
- An enhanced V-cycle MgNet model for operator learning in numerical partial differential equations
- Convergence Rates for Learning Linear Operators from Noisy Data
- Large-scale Bayesian optimal experimental design with derivative-informed projected neural network
- Transferable neural networks for partial differential equations
- Multi-scale time-stepping of partial differential equations with transformers