Kernels for vector-valued functions: a review
From MaRDI portal
Publication:2903301
Abstract: Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and a considerable amount of work has been devoted to designing and learning kernels in that setting. More recently, there has been increasing interest in methods that deal with multiple outputs, motivated in part by frameworks such as multitask learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
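A standard way to build a valid kernel for multiple outputs, discussed in the review, is the separable ("intrinsic coregionalization") construction: a scalar input kernel is combined with a positive semi-definite coregionalization matrix that encodes output correlations. The following is a minimal NumPy sketch under illustrative assumptions; the function names and the example matrix `B` are not taken from the paper.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    # Scalar RBF (squared-exponential) kernel between 1-D input sets.
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * sqdist / lengthscale**2)

def icm_kernel(X1, X2, B, lengthscale=1.0):
    # Separable matrix-valued kernel: K = B (Kronecker) k(X1, X2).
    # Block (d, d') holds B[d, d'] * k(X1, X2), coupling outputs d and d'.
    return np.kron(B, rbf(X1, X2, lengthscale))

# Illustrative coregionalization matrix for two correlated outputs
# (positive semi-definite: eigenvalues 1 +/- 0.8 >= 0).
B = np.array([[1.0, 0.8],
              [0.8, 1.0]])
X = np.linspace(0.0, 1.0, 5)
K = icm_kernel(X, X, B)  # (2*5) x (2*5) symmetric PSD Gram matrix
```

Because the Kronecker product of two positive semi-definite matrices is positive semi-definite, the resulting `K` is a valid covariance for a vector-valued Gaussian process, which is the probabilistic reading of the same construction.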
Recommendations
Cited in (81 documents)
- Multivariate versus univariate Kriging metamodels for multi-response simulation models
- Large scale multi-output multi-class classification using Gaussian processes
- Phase retrieval of complex and vector-valued functions
- Operator-valued kernel-based vector autoregressive models for network inference
- Computationally efficient convolved multiple output Gaussian processes
- Calibrate, emulate, sample
- High-dimensional Bayesian optimization using low-dimensional feature spaces
- Neuronal spike train analysis using Gaussian process models
- Cross-covariance functions for multivariate geostatistics
- Varying-coefficient models for geospatial transfer learning
- Output Fisher embedding regression
- Residual Gaussian process: a tractable nonparametric Bayesian emulator for multi-fidelity simulations
- Multi-target prediction: a unifying view on problems and methods
- Grouped Gaussian processes for solar power prediction
- Data-driven model order reduction for problems with parameter-dependent jump-discontinuities
- A Riemannian gossip approach to subspace learning on Grassmann manifold
- Bayesian optimization of variable-size design space problems
- GParareal: a time-parallel ODE solver using Gaussian process emulation
- Mixture of multivariate Gaussian processes for classification of irregularly sampled satellite image time-series
- Ensembles for multi-target regression with random output selections
- Experimental design for nonparametric correction of misspecified dynamical models
- A constrained matrix-variate Gaussian process for transposable data
- MAGMA: inference and prediction using multi-task Gaussian processes with common mean
- A Bayesian optimization approach to find Nash equilibria
- Input output kernel regression: supervised and semi-supervised structured output prediction with operator-valued kernels
- Operator learning approach for the limited view problem in photoacoustic tomography
- Multi-output learning via spectral filtering
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
- A Comparative Study of Pairwise Learning Methods Based on Kernel Ridge Regression
- Symmetry exploits for Bayesian cubature methods
- A representer theorem for deep neural networks
- Gaussian process hydrodynamics
- A non-intrusive multifidelity method for the reduced order modeling of nonlinear problems
- Convex optimization in sums of Banach spaces
- A unifying representer theorem for inverse problems and machine learning
- Reduced order models for many-query subsurface flow applications
- Just interpolate: kernel "ridgeless" regression can generalize
- On Learning Vector-Valued Functions
- Multi-fidelity surrogate modeling using long short-term memory networks
- Machine learning-based multi-objective optimization for efficient identification of crystal plasticity model parameters
- Multi-fidelity regression using artificial neural networks: efficient approximation of parameter-dependent output quantities
- Convergence rates for matrix P-greedy variants
- Deep coregionalization for the emulation of simulation-based spatial-temporal fields
- Symplectic Gaussian process regression of maps in Hamiltonian systems
- scientific article (zbMATH DE number 7370541; no title available)
- Structured learning of rigid‐body dynamics: A survey and unified view from a robotics perspective
- Dynamic mode decomposition in vector-valued reproducing kernel Hilbert spaces for extracting dynamical structure among observables
- Generalized probabilistic principal component analysis of correlated data
- Kernel-based methods for vector-valued data with correlated components
- Bayesian optimization of functional output in inverse problems
- Bayesian optimization for policy search via online-offline experimentation
- Two-Layer Neural Networks with Values in a Banach Space
- Exact Bayesian Inference in Spatiotemporal Cox Processes Driven by Multivariate Gaussian Processes
- Large scale multi-label learning using Gaussian processes
- Multi-target regression via input space expansion: treating targets as inputs
- Multi-task learning in vector-valued reproducing kernel Banach spaces with the \(\ell^1\) norm
- Nonparametric Modeling and Prognosis of Condition Monitoring Signals Using Multivariate Gaussian Convolution Processes
- A review on statistical and machine learning competing risks methods
- Learning nonparametric ordinary differential equations from noisy data
- Heteroscedastic Gaussian process regression for material structure-property relationship modeling
- Interpolation with uncoupled separable matrix-valued kernels
- Gaussian processes for Bayesian inverse problems associated with linear partial differential equations
- GFN: a graph feedforward network for resolution-invariant reduced operator learning in multifidelity applications
- Kernel methods are competitive for operator learning
- Gradient-enhanced deep Gaussian processes for multifidelity modeling
- Learning system parameters from Turing patterns
- Bayesian estimation of large-scale simulation models with Gaussian process regression surrogates
- Bayesian optimization with safety constraints: safe and automatic parameter tuning in robotics
- scientific article (zbMATH DE number 7625160; no title available)
- On Negative Transfer and Structure of Latent Functions in Multioutput Gaussian Processes
- Hilbert C∗-Module for Analyzing Structured Data
- Gaussian kernel with correlated variables for incomplete data
- Bayesian optimisation for constrained problems
- A Bayesian decision framework for optimizing sequential combination antiretroviral therapy in people with HIV
- From kernel methods to neural networks: a unifying variational formulation
- Surrogate Modeling with Gaussian Processes for an Inverse Problem in Polymer Dynamics
- Deep Gaussian process for multi-objective Bayesian optimization
- Intercorrelated random fields with bounds and the Bayesian identification of their parameters: Application to linear elastic struts and fibers
- Operator learning with Gaussian processes
- Targeted adaptive design
- Uncertainty modeling and propagation for groundwater flow: a comparative study of surrogates