Learning non-parametric basis independent models from point queries via low-rank methods
Publication: 741260
DOI: 10.1016/j.acha.2014.01.002
zbMath: 1373.68336
arXiv: 1310.1826
OpenAlex: W2012411663
MaRDI QID: Q741260
Publication date: 11 September 2014
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/1310.1826
Keywords: nonlinear approximation; low-rank matrix recovery; high-dimensional function approximation; multi-ridge functions; oracle-based learning
Mathematics Subject Classification: Learning and adaptive systems in artificial intelligence (68T05) ⋮ Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
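Note: the "multi-ridge functions" keyword refers to the model class commonly written as

\[
f(x) = g(Ax), \qquad x \in \mathbb{R}^d, \quad A \in \mathbb{R}^{k \times d}, \quad k \ll d,
\]

where \(g : \mathbb{R}^k \to \mathbb{R}\) is a smooth non-parametric link function; the low-rank methods named in the title estimate the row space of \(A\) from point queries of \(f\), after which \(g\) can be approximated on the recovered low-dimensional subspace. This formulation is a sketch inferred from the title and keywords, not a statement quoted from the record.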
Related Items
Certified dimension reduction in nonlinear Bayesian inverse problems ⋮ Multifidelity Dimension Reduction via Active Subspaces ⋮ Global optimization using random embeddings ⋮ Bound-constrained global optimization of functions with low effective dimensionality using multiple random embeddings ⋮ The recovery of ridge functions on the hypercube suffers from the curse of dimensionality ⋮ Matrix recipes for hard thresholding methods ⋮ Data-Driven Polynomial Ridge Approximation Using Variable Projection ⋮ Embedded ridge approximations ⋮ On recovery of regular ridge functions ⋮ Dimension Reduction via Gaussian Ridge Functions ⋮ Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments ⋮ A near-stationary subspace for ridge approximation ⋮ Gaussian Quadrature and Polynomial Approximation for One-Dimensional Ridge Functions ⋮ Recovery of regular ridge functions on the ball ⋮ On two continuum armed bandit problems in high dimensions
Cites Work
- Learning functions of few arbitrary linear parameters in high dimensions
- Sparsity in multiple kernel learning
- On almost linearity of low dimensional projections from high dimensional data
- Component selection and smoothing in multivariate nonparametric regression
- Projection-based approximation and a duality with kernel methods
- High-dimensional additive modeling
- Projection pursuit
- Optimal reconstruction of a function from its projections
- Harmonic analysis of neural networks
- Ridgelets: estimating with ridge functions
- Adaptive estimation of a quadratic functional by model selection
- Approximation of infinitely differentiable multivariate functions is intractable
- Capturing ridge functions in high dimensions from point queries
- Exact matrix completion via convex optimization
- Robust principal component analysis?
- Minimal Errors for Strong and Weak Approximation of Stochastic Differential Equations
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Sliced Inverse Regression for Dimension Reduction
- Rank one perturbation and its application to the Laplacian spectrum of a graph
- Sparse Additive Models
- An Adaptive Estimation of Dimension Reduction Space
- Ridgelets: a key to higher-dimensional intermittency?
- ADMiRA: Atomic Decomposition for Minimum Rank Approximation
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- A Multiple-Index Model and Dimension Reduction
- Perturbation bounds in connection with singular value decomposition