Kernels for vector-valued functions: a review
Publication: 2903301
DOI: 10.1561/2200000036
zbMATH Open: 1301.68212
arXiv: 1106.6251
OpenAlex: W4206212643
Wikidata: Q57831467
Scholia: Q57831467
MaRDI QID: Q2903301
Authors: Mauricio A. Álvarez, Lorenzo Rosasco, Neil D. Lawrence
Publication date: 8 August 2012
Published in: Foundations and Trends in Machine Learning
Abstract: Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective, they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective, they are key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed a considerable amount of work has been devoted to designing and learning kernels. More recently there has been increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multitask learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
Full work available at URL: https://arxiv.org/abs/1106.6251
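One family of multi-output kernels covered in the review is the separable (intrinsic coregionalization) construction, where a scalar input kernel is coupled with an output covariance matrix: K((x, i), (x', j)) = B[i, j] · k(x, x'). The sketch below is illustrative only and not taken from the paper; the function names and the toy data are assumptions.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    # Scalar RBF kernel k(x, x') = exp(-||x - x'||^2 / (2 * lengthscale^2)).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def icm_kernel(X1, X2, B, lengthscale=1.0):
    # Separable multi-output kernel: the Kronecker product of the
    # output-coupling matrix B (D x D, PSD) with the input kernel matrix.
    return np.kron(B, rbf(X1, X2, lengthscale))

# Toy example: D = 2 correlated outputs observed at N = 3 inputs.
X = np.linspace(0.0, 1.0, 3)[:, None]
B = np.array([[1.0, 0.8],
              [0.8, 1.0]])  # hypothetical output covariance
K = icm_kernel(X, X, B)
print(K.shape)  # (6, 6): the (D*N) x (D*N) joint covariance
```

Because both factors are positive semidefinite, the Kronecker product K is a valid joint covariance over all D·N output-input pairs, which is what makes the construction usable as a Gaussian-process covariance function.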
Cited In (81)
- Grouped Gaussian processes for solar power prediction
- Data-driven model order reduction for problems with parameter-dependent jump-discontinuities
- Computationally efficient convolved multiple output Gaussian processes
- GParareal: a time-parallel ODE solver using Gaussian process emulation
- Two-Layer Neural Networks with Values in a Banach Space
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
- Multi-task learning in vector-valued reproducing kernel Banach spaces with the \(\ell^1\) norm
- A representer theorem for deep neural networks
- A non-intrusive multifidelity method for the reduced order modeling of nonlinear problems
- Residual Gaussian process: a tractable nonparametric Bayesian emulator for multi-fidelity simulations
- Output Fisher embedding regression
- Input output kernel regression: supervised and semi-supervised structured output prediction with operator-valued kernels
- Deep coregionalization for the emulation of simulation-based spatial-temporal fields
- Bayesian optimization for policy search via online-offline experimentation
- Bayesian optimization of variable-size design space problems
- Kernel-based methods for vector-valued data with correlated components
- High-dimensional Bayesian optimization using low-dimensional feature spaces
- Just interpolate: kernel "ridgeless" regression can generalize
- Title not available
- Multi-target prediction: a unifying view on problems and methods
- Experimental design for nonparametric correction of misspecified dynamical models
- Gaussian process hydrodynamics
- Reduced order models for many-query subsurface flow applications
- Phase retrieval of complex and vector-valued functions
- A Bayesian optimization approach to find Nash equilibria
- Symplectic Gaussian process regression of maps in Hamiltonian systems
- Exact Bayesian Inference in Spatiotemporal Cox Processes Driven by Multivariate Gaussian Processes
- Convex optimization in sums of Banach spaces
- Dynamic mode decomposition in vector-valued reproducing kernel Hilbert spaces for extracting dynamical structure among observables
- A constrained matrix-variate Gaussian process for transposable data
- Symmetry exploits for Bayesian cubature methods
- Convergence rates for matrix P-greedy variants
- Large scale multi-label learning using Gaussian processes
- Ensembles for multi-target regression with random output selections
- MAGMA: inference and prediction using multi-task Gaussian processes with common mean
- Multi-output learning via spectral filtering
- Multi-fidelity surrogate modeling using long short-term memory networks
- Machine learning-based multi-objective optimization for efficient identification of crystal plasticity model parameters
- Structured learning of rigid‐body dynamics: A survey and unified view from a robotics perspective
- Bayesian optimization of functional output in inverse problems
- Operator-valued kernel-based vector autoregressive models for network inference
- A Riemannian gossip approach to subspace learning on Grassmann manifold
- On Learning Vector-Valued Functions
- Generalized probabilistic principal component analysis of correlated data
- Operator learning approach for the limited view problem in photoacoustic tomography
- Multi-fidelity regression using artificial neural networks: efficient approximation of parameter-dependent output quantities
- Large scale multi-output multi-class classification using Gaussian processes
- Calibrate, emulate, sample
- Varying-coefficient models for geospatial transfer learning
- Neuronal spike train analysis using Gaussian process models
- Mixture of multivariate Gaussian processes for classification of irregularly sampled satellite image time-series
- A Comparative Study of Pairwise Learning Methods Based on Kernel Ridge Regression
- Multi-target regression via input space expansion: treating targets as inputs
- Multivariate versus univariate Kriging metamodels for multi-response simulation models
- Cross-covariance functions for multivariate geostatistics
- A unifying representer theorem for inverse problems and machine learning
- Learning nonparametric ordinary differential equations from noisy data
- A Bayesian decision framework for optimizing sequential combination antiretroviral therapy in people with HIV
- Bayesian estimation of large-scale simulation models with Gaussian process regression surrogates
- Gaussian processes for Bayesian inverse problems associated with linear partial differential equations
- Uncertainty modeling and propagation for groundwater flow: a comparative study of surrogates
- Surrogate Modeling with Gaussian Processes for an Inverse Problem in Polymer Dynamics
- Deep Gaussian process for multi-objective Bayesian optimization
- Bayesian optimization with safety constraints: safe and automatic parameter tuning in robotics
- Heteroscedastic Gaussian process regression for material structure-property relationship modeling
- Intercorrelated random fields with bounds and the Bayesian identification of their parameters: Application to linear elastic struts and fibers
- Nonparametric Modeling and Prognosis of Condition Monitoring Signals Using Multivariate Gaussian Convolution Processes
- A review on statistical and machine learning competing risks methods
- Learning system parameters from Turing patterns
- Hilbert C∗-Module for Analyzing Structured Data
- Kernel methods are competitive for operator learning
- Gaussian kernel with correlated variables for incomplete data
- Bayesian optimisation for constrained problems
- GFN: a graph feedforward network for resolution-invariant reduced operator learning in multifidelity applications
- Gradient-enhanced deep Gaussian processes for multifidelity modeling
- Title not available
- From kernel methods to neural networks: a unifying variational formulation
- On Negative Transfer and Structure of Latent Functions in Multioutput Gaussian Processes
- Operator learning with Gaussian processes
- Targeted adaptive design
- Interpolation with uncoupled separable matrix-valued kernels