Multivariate Regression and Machine Learning with Sums of Separable Functions



DOI: 10.1137/070710524
zbMath: 1190.62135
MaRDI QID: Q3567008

Jochen Garcke, Gregory Beylkin, Martin J. Mohlenkamp

Publication date: 10 June 2010

Published in: SIAM Journal on Scientific Computing

Full work available at URL: https://semanticscholar.org/paper/916717cefb3e648bfa75f1264a2389febb8003e3


MSC Classification

62H99: Multivariate analysis

62J99: Linear inference, regression

62J02: General nonlinear regression

68T05: Learning and adaptive systems in artificial intelligence

65D15: Algorithms for approximation of functions


Related Items

- Range-Separated Tensor Format for Many-Particle Modeling
- Interpretable Approximation of High-Dimensional Data
- Fast computation of the multidimensional fractional Laplacian
- Generalized Canonical Polyadic Tensor Decomposition
- Sparse low-rank separated representation models for learning from data
- Randomized Algorithms for Rounding in the Tensor-Train Format
- HARFE: hard-ridge random feature expansion
- Approximation of solutions to multidimensional parabolic equations by approximate approximations
- Low-rank separated representation surrogates of high-dimensional stochastic functions: application in Bayesian inference
- Discontinuous Legendre wavelet element method for elliptic partial differential equations
- Fast high-dimensional approximation with sparse occupancy trees
- \(O(d \log N)\)-quantics approximation of \(N\)-\(d\) tensors in high-dimensional numerical modeling
- Polynomial meta-models with canonical low-rank approximations: numerical insights and comparison to sparse polynomial chaos expansions
- Randomized interpolative decomposition of separated representations
- A stochastic conjugate gradient method for the approximation of functions
- Black box low tensor-rank approximation using fiber-crosses
- A least-squares approximation of partial differential equations with high-dimensional random inputs
- Orbit uncertainty propagation and sensitivity analysis with separated representations
- Nearest-neighbor interaction systems in the tensor-train format
- Optimization via separated representations and the canonical tensor decomposition
- The numerical approximation of nonlinear functionals and functional differential equations
- On the computational benefit of tensor separation for high-dimensional discrete convolutions
- A continuous analogue of the tensor-train decomposition
- Parallel tensor methods for high-dimensional linear PDEs
- Sparse mixture models inspired by ANOVA decompositions
- Neural-network based collision operators for the Boltzmann equation
- Alternate algorithms to most referenced techniques of numerical optimization to solve the symmetric rank-\(R\) approximation problem of symmetric tensors
- Tensor methods for the Boltzmann-BGK equation
- Reduction of multivariate mixtures and its applications
- Low-rank Riemannian eigensolver for high-dimensional Hamiltonians
- Numerical methods for high-dimensional probability density function equations
- On manifolds of tensors of fixed TT-rank
- Non-intrusive low-rank separated approximation of high-dimensional stochastic models
- Learning multivariate functions with low-dimensional structures using polynomial bases
- A literature survey of low-rank tensor approximation techniques
- Mixed discontinuous Legendre wavelet Galerkin method for solving elliptic partial differential equations
- The Optimization Landscape for Fitting a Rank-2 Tensor with a Rank-1 Tensor
- A Least-Squares Method for Sparse Low Rank Approximation of Multivariate Functions
- Tensor Completion in Hierarchical Tensor Representations