Interpolating splines on graphs for data science applications (Q778025)

From MaRDI portal

scientific article (English)

    Statements

    Interpolating splines on graphs for data science applications (English)
    30 June 2020
    Interpolation in many variables is an established tool in numerical analysis and approximation theory. In this context, it is especially important to identify properties of Lagrange functions which, given some data points, are each 1 at exactly one point and vanish at all the others. (The idea comes, of course, from univariate Lagrange interpolation by polynomials.) This is because (i) they allow a particularly useful, easy formulation of the approximant, and (ii) the study of their properties gives important insights into the qualities of the method -- localness (reflected by the decay of the Lagrange functions), stability (estimates of Lebesgue constants), and convergence properties (reproduction of some finite-dimensional subspace, usually polynomials of some degree). Good examples are Lagrange functions based on kernels, radial basis functions, and polyharmonic splines (note that here, for example, localness is guaranteed by the Lagrange functions' exponential decay). In this paper the authors pursue exactly that track, but the Lagrange functions and their 0/1 values are defined on \textit{graphs} (defined by vertices and edges) rather than on point sets. All other features of the \textit{Ansatz} described above carry through in the same way: easy formulation of interpolants on \textbf{graphs} instead of points, localness, etc. This is established in much detail, compared with the classical approaches and results from radial basis function approximation, and applications in useful, modern fields such as machine learning are given.
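    The construction sketched in the review can be made concrete with a small numerical example. The following is a minimal sketch, not the paper's exact method: it builds a positive-definite kernel from a graph Laplacian (here the illustrative choice \((L + \varepsilon I)^{-s}\) with hypothetical parameters \(\varepsilon = 0.1\), \(s = 2\)), forms Lagrange functions on a subset of data vertices, and assembles the interpolant as a weighted sum of those Lagrange functions. The path graph, the chosen data vertices, and the sample values are all invented for illustration.

```python
import numpy as np

# Small path graph on 6 vertices: adjacency matrix and graph Laplacian.
n = 6
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Positive-definite kernel on all vertices: K = (L + eps*I)^{-s}.
# eps and s are illustrative parameters, not taken from the paper.
eps, s = 0.1, 2
K = np.linalg.matrix_power(np.linalg.inv(L + eps * np.eye(n)), s)

# Data vertices (a subset of the graph) and sample values there.
idx = [0, 2, 5]
f = np.array([1.0, -0.5, 2.0])

# Lagrange functions: columns of K[:, idx] @ inv(K[idx, idx]).
# By construction each column equals 1 at "its" data vertex and 0
# at the other data vertices -- the cardinality property the
# review describes.
Kxx = K[np.ix_(idx, idx)]
Lag = K[:, idx] @ np.linalg.inv(Kxx)

# The interpolant is then just a weighted sum of Lagrange functions,
# which is the "easy formulation of the approximant" mentioned above.
sfun = Lag @ f

print(np.round(Lag[idx], 6))  # identity matrix on the data vertices
print(np.round(sfun[idx], 6))  # reproduces the data values
```

    Restricted to the data vertices, the matrix of Lagrange functions is the identity, so the interpolant reproduces the given values there exactly; away from the data vertices it decays at a rate governed by the kernel, mirroring the localness discussion in the review.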
    local basis functions
    kernel-based machine learning
    interpolation
    Lagrange functions

    Identifiers