Fast matrix computations for functional additive models


DOI: 10.1007/S11222-014-9490-0
zbMATH Open: 1331.62027
arXiv: 1402.4984
OpenAlex: W1992731132
MaRDI QID: Q5963540

Simon Barthelmé

Publication date: 22 February 2016

Published in: Statistics and Computing

Abstract: It is common in functional data analysis to look at a set of related functions: a set of learning curves, a set of brain signals, a set of spatial maps, etc. One way to express relatedness is through an additive model, whereby each individual function g_i(x) is assumed to be a variation around some shared mean f(x). Gaussian processes provide an elegant way of constructing such additive models, but suffer from computational difficulties arising from the matrix operations that need to be performed. Recently, Heersink & Furrer have shown that functional additive models give rise to covariance matrices of a specific form they call quasi-Kronecker (QK), whose inverses are relatively tractable. We show that under additional assumptions the two-level additive model leads to a class of matrices we call restricted quasi-Kronecker (rQK), which enjoy many interesting properties. In particular, we formulate matrix factorisations whose complexity scales only linearly in the number of functions in the latent field, an enormous improvement over the cubic scaling of naïve approaches. We describe how to leverage the properties of rQK matrices for inference in latent Gaussian models.
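The key idea in the abstract can be illustrated with a small sketch (assumptions, not the paper's actual factorisations: the covariance blocks K_f and K_g are hypothetical, and the solve shown is a simple block-summation identity, one elementary way such quasi-Kronecker structure yields cost linear in the number of functions n). With the two-level model g_i = f + d_i, the stacked covariance is J_n ⊗ K_f + I_n ⊗ K_g, and a linear system with it reduces to one solve with (K_g + n K_f) plus one reusable solve with K_g:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 4                      # m grid points, n functions

def random_spd(k):
    """Random symmetric positive-definite matrix (placeholder covariance)."""
    M = rng.standard_normal((k, k))
    return M @ M.T + k * np.eye(k)

# Hypothetical covariance blocks: K_f for the shared mean f,
# K_g for the individual deviations d_i.
K_f, K_g = random_spd(m), random_spd(m)

# Two-level additive model g_i = f + d_i gives the stacked covariance
#   Sigma = J_n (x) K_f + I_n (x) K_g,  with J_n = ones((n, n)):
# a quasi-Kronecker structure.
Sigma = np.kron(np.ones((n, n)), K_f) + np.kron(np.eye(n), K_g)

y = rng.standard_normal(n * m)
x_naive = np.linalg.solve(Sigma, y)          # dense solve: O((nm)^3)

# Structured solve, linear in n: writing x in blocks x_1..x_n,
#   (Sigma x)_i = K_g x_i + K_f * s,  where s = sum_j x_j.
# Summing over i gives (K_g + n K_f) s = sum_i y_i, after which
# each block is recovered as x_i = K_g^{-1} (y_i - K_f s).
Y = y.reshape(n, m)
s = np.linalg.solve(K_g + n * K_f, Y.sum(axis=0))
X = np.linalg.solve(K_g, (Y - K_f @ s).T).T  # one m-by-m solve, reused
x_fast = X.ravel()

assert np.allclose(x_naive, x_fast)
```

Only m-by-m systems are ever factorised, so the cost grows linearly in n rather than cubically; the paper develops this kind of structure much further (factorisations, determinants, and their use in latent Gaussian model inference).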


Full work available at URL: https://arxiv.org/abs/1402.4984


