Polynomial approximation via compressed sensing of high-dimensional functions on lower sets

Publication: 4605704

DOI: 10.1090/mcom/3272
OpenAlex: W2284492459
MaRDI QID: Q4605704

Authors: Abdellah Chkifa, Nick Dexter, Hoang Tran, Clayton G. Webster

Publication date: 27 February 2018

Published in: Mathematics of Computation

Full work available at URL: https://arxiv.org/abs/1602.05823



Related Items

Infinite-dimensional \(\ell ^1\) minimization and function approximation from pointwise data
Discrete least-squares approximations over optimized downward closed polynomial spaces in arbitrary dimension
Near-Optimal Sampling Strategies for Multivariate Function Approximation on General Domains
Accelerating Stochastic Collocation Methods for Partial Differential Equations with Random Input Data
Hyperspherical Sparse Approximation Techniques for High-Dimensional Discontinuity Detection
Infinite-dimensional compressed sensing and function interpolation
On the strong convergence of forward-backward splitting in reconstructing jointly sparse signals
Adaptive group Lasso neural network models for functions of few variables and time-dependent data
Analysis of sparse recovery for Legendre expansions using envelope bound
A metalearning approach for physics-informed neural networks (PINNs): application to parameterized PDEs
Correcting for unknown errors in sparse high-dimensional function approximation
Optimal approximation of infinite-dimensional holomorphic functions
APPROXIMATING SMOOTH, MULTIVARIATE FUNCTIONS ON IRREGULAR DOMAINS
Constructing Least-Squares Polynomial Approximations
Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing
Flavors of Compressive Sensing
Compressed Sensing with Sparse Corruptions: Fault-Tolerant Sparse Collocation Approximations
Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
A class of null space conditions for sparse recovery via nonconvex, non-separable minimizations
Towards optimal sampling for learning sparse approximation in high dimensions
A Compressive Spectral Collocation Method for the Diffusion Equation Under the Restricted Isometry Property
Sparse polynomial chaos expansions via compressed sensing and D-optimal design
An efficient and robust adaptive sampling method for polynomial chaos expansion in sparse Bayesian learning framework
A Gradient-Enhanced L1 Approach for the Recovery of Sparse Trigonometric Polynomials
Sparse harmonic transforms: a new class of sublinear-time algorithms for learning functions of many variables
Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs
Sparse harmonic transforms. II: Best \(s\)-term approximation guarantees for bounded orthonormal product bases in sublinear-time
Multivariate extensions of isotonic regression and total variation denoising via entire monotonicity and Hardy-Krause variation
A mixed \(\ell ^1\) regularization approach for sparse simultaneous approximation of parameterized PDEs
Compressive Hermite interpolation: sparse, high-dimensional approximation from gradient-augmented measurements
The Gap between Theory and Practice in Function Approximation with Deep Neural Networks
GenMod: a generative modeling approach for spectral representation of PDEs with random inputs
Generalization bounds for sparse random feature expansions


Cites Work