Multivariate regression and machine learning with sums of separable functions
From MaRDI portal
Publication:3567008
Recommendations
- Learning to Predict Physical Properties using Sums of Separable Functions
- Sparse low-rank separated representation models for learning from data
- Learning multivariate functions with low-dimensional structures using polynomial bases
- Product and sum separable functions
- Multivariate adaptive regression splines
Cited in (42)
- The numerical approximation of nonlinear functionals and functional differential equations
- Orbit uncertainty propagation and sensitivity analysis with separated representations
- Reduction of multivariate mixtures and its applications
- Learning multivariate functions with low-dimensional structures using polynomial bases
- Low-rank Riemannian eigensolver for high-dimensional Hamiltonians
- Fast high-dimensional approximation with sparse occupancy trees
- A least-squares method for sparse low rank approximation of multivariate functions
- Discontinuous Legendre wavelet element method for elliptic partial differential equations
- A continuous analogue of the tensor-train decomposition
- Learning to Predict Physical Properties using Sums of Separable Functions
- HARFE: hard-ridge random feature expansion
- Sparse low-rank separated representation models for learning from data
- A least-squares approximation of partial differential equations with high-dimensional random inputs
- Polynomial meta-models with canonical low-rank approximations: numerical insights and comparison to sparse polynomial chaos expansions
- Randomized interpolative decomposition of separated representations
- Tensor methods for the Boltzmann-BGK equation
- On manifolds of tensors of fixed TT-rank
- The optimization landscape for fitting a rank-2 tensor with a rank-1 tensor
- Approximation of solutions to multidimensional parabolic equations by approximate approximations
- Nearest-neighbor interaction systems in the tensor-train format
- \(O(d \log N)\)-quantics approximation of \(N\)-\(d\) tensors in high-dimensional numerical modeling
- Interpretable approximation of high-dimensional data
- Numerical methods for high-dimensional probability density function equations
- Black box low tensor-rank approximation using fiber-crosses
- Multi-output regression on the output manifold
- Randomized Algorithms for Rounding in the Tensor-Train Format
- A stochastic conjugate gradient method for the approximation of functions
- Tensor completion in hierarchical tensor representations
- On the computational benefit of tensor separation for high-dimensional discrete convolutions
- Low-rank separated representation surrogates of high-dimensional stochastic functions: application in Bayesian inference
- Generalized canonical polyadic tensor decomposition
- A literature survey of low-rank tensor approximation techniques
- Sparse mixture models inspired by ANOVA decompositions
- Optimization via separated representations and the canonical tensor decomposition
- Mixed discontinuous Legendre wavelet Galerkin method for solving elliptic partial differential equations
- Range-separated tensor format for many-particle modeling
- An alternating shifted higher order power method based algorithm for rank-\(R\) Hermitian approximation and solving Hermitian CP-decomposition problems
- Non-intrusive low-rank separated approximation of high-dimensional stochastic models
- Neural-network based collision operators for the Boltzmann equation
- Alternate algorithms to most referenced techniques of numerical optimization to solve the symmetric rank-\(R\) approximation problem of symmetric tensors
- Parallel tensor methods for high-dimensional linear PDEs
- Fast computation of the multidimensional fractional Laplacian