Kernels for vector-valued functions: a review
From MaRDI portal
Publication:2903301
Abstract: Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are the key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed a considerable amount of work has been devoted to designing and learning kernels. More recently there has been an increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multitask learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
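A central construction reviewed in the paper is the separable, or intrinsic coregionalization, matrix-valued kernel, in which a scalar kernel on inputs is coupled with a positive semi-definite matrix over the outputs. The sketch below illustrates this construction; the function names, the RBF choice of scalar kernel, and the example values are illustrative assumptions, not code from the paper.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    """Scalar RBF (squared-exponential) kernel matrix between input sets."""
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * d2 / lengthscale**2)

def icm_kernel(X1, X2, B, lengthscale=1.0):
    """Separable matrix-valued kernel K((x, d), (x', d')) = B[d, d'] k(x, x'),
    assembled as the Kronecker product B ⊗ k(X1, X2)."""
    return np.kron(B, rbf(X1, X2, lengthscale))

# Example: D = 2 correlated outputs observed at N = 3 inputs.
X = np.linspace(0.0, 1.0, 3)[:, None]
W = np.array([[1.0], [0.8]])           # rank-1 factor over outputs
B = W @ W.T + 0.1 * np.eye(2)          # PSD coregionalization matrix
K = icm_kernel(X, X, B)                # (2*3) x (2*3) joint covariance
```

Because both `B` and the scalar kernel matrix are positive semi-definite, their Kronecker product is as well, so `K` is a valid covariance for a two-output Gaussian process over the three inputs.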
Cited in (81)
- Interpolation with uncoupled separable matrix-valued kernels
- Data-driven model order reduction for problems with parameter-dependent jump-discontinuities
- Grouped Gaussian processes for solar power prediction
- Learning nonparametric ordinary differential equations from noisy data
- A Bayesian decision framework for optimizing sequential combination antiretroviral therapy in people with HIV
- GParareal: a time-parallel ODE solver using Gaussian process emulation
- Computationally efficient convolved multiple output Gaussian processes
- Two-Layer Neural Networks with Values in a Banach Space
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
- Multi-task learning in vector-valued reproducing kernel Banach spaces with the \(\ell^1\) norm
- A representer theorem for deep neural networks
- Bayesian estimation of large-scale simulation models with Gaussian process regression surrogates
- A non-intrusive multifidelity method for the reduced order modeling of nonlinear problems
- Gaussian processes for Bayesian inverse problems associated with linear partial differential equations
- Residual Gaussian process: a tractable nonparametric Bayesian emulator for multi-fidelity simulations
- Output Fisher embedding regression
- Input output kernel regression: supervised and semi-supervised structured output prediction with operator-valued kernels
- Deep coregionalization for the emulation of simulation-based spatial-temporal fields
- Uncertainty modeling and propagation for groundwater flow: a comparative study of surrogates
- Bayesian optimization for policy search via online-offline experimentation
- Bayesian optimization of variable-size design space problems
- Surrogate Modeling with Gaussian Processes for an Inverse Problem in Polymer Dynamics
- Deep Gaussian process for multi-objective Bayesian optimization
- Kernel-based methods for vector-valued data with correlated components
- High-dimensional Bayesian optimization using low-dimensional feature spaces
- Just interpolate: kernel ``ridgeless'' regression can generalize
- Bayesian optimization with safety constraints: safe and automatic parameter tuning in robotics
- scientific article; zbMATH DE number 7370541 (no title available)
- Multi-target prediction: a unifying view on problems and methods
- Reduced order models for many-query subsurface flow applications
- Experimental design for nonparametric correction of misspecified dynamical models
- Heteroscedastic Gaussian process regression for material structure-property relationship modeling
- Phase retrieval of complex and vector-valued functions
- Gaussian process hydrodynamics
- Intercorrelated random fields with bounds and the Bayesian identification of their parameters: Application to linear elastic struts and fibers
- Nonparametric Modeling and Prognosis of Condition Monitoring Signals Using Multivariate Gaussian Convolution Processes
- A review on statistical and machine learning competing risks methods
- Learning system parameters from Turing patterns
- A Bayesian optimization approach to find Nash equilibria
- Hilbert C∗-Module for Analyzing Structured Data
- Symplectic Gaussian process regression of maps in Hamiltonian systems
- Exact Bayesian Inference in Spatiotemporal Cox Processes Driven by Multivariate Gaussian Processes
- Convex optimization in sums of Banach spaces
- Kernel methods are competitive for operator learning
- Gaussian kernel with correlated variables for incomplete data
- Bayesian optimisation for constrained problems
- Dynamic mode decomposition in vector-valued reproducing kernel Hilbert spaces for extracting dynamical structure among observables
- A constrained matrix-variate Gaussian process for transposable data
- Symmetry exploits for Bayesian cubature methods
- Large scale multi-label learning using Gaussian processes
- Convergence rates for matrix P-greedy variants
- MAGMA: inference and prediction using multi-task Gaussian processes with common mean
- Multi-output learning via spectral filtering
- GFN: a graph feedforward network for resolution-invariant reduced operator learning in multifidelity applications
- Gradient-enhanced deep Gaussian processes for multifidelity modeling
- Ensembles for multi-target regression with random output selections
- Operator-valued kernel-based vector autoregressive models for network inference
- Multi-fidelity surrogate modeling using long short-term memory networks
- Machine learning-based multi-objective optimization for efficient identification of crystal plasticity model parameters
- Bayesian optimization of functional output in inverse problems
- A Riemannian gossip approach to subspace learning on Grassmann manifold
- Structured learning of rigid‐body dynamics: A survey and unified view from a robotics perspective
- On Learning Vector-Valued Functions
- Generalized probabilistic principal component analysis of correlated data
- Operator learning approach for the limited view problem in photoacoustic tomography
- Multi-fidelity regression using artificial neural networks: efficient approximation of parameter-dependent output quantities
- Varying-coefficient models for geospatial transfer learning
- Calibrate, emulate, sample
- scientific article; zbMATH DE number 7625160 (no title available)
- From kernel methods to neural networks: a unifying variational formulation
- Large scale multi-output multi-class classification using Gaussian processes
- Neuronal spike train analysis using Gaussian process models
- On Negative Transfer and Structure of Latent Functions in Multioutput Gaussian Processes
- Mixture of multivariate Gaussian processes for classification of irregularly sampled satellite image time-series
- Operator learning with Gaussian processes
- Targeted adaptive design
- Multi-target regression via input space expansion: treating targets as inputs
- A Comparative Study of Pairwise Learning Methods Based on Kernel Ridge Regression
- Multivariate versus univariate Kriging metamodels for multi-response simulation models
- Cross-covariance functions for multivariate geostatistics
- A unifying representer theorem for inverse problems and machine learning