Correcting for unknown errors in sparse high-dimensional function approximation
Publication: 2424851
DOI: 10.1007/s00211-019-01051-9
zbMATH: 1415.65030
arXiv: 1711.07622
OpenAlex: W2963183022
Wikidata: Q127937698
Scholia: Q127937698
MaRDI QID: Q2424851
Authors: Ben Adcock, Anyi Bao, Simone Brugiapaglia
Publication date: 25 June 2019
Published in: Numerische Mathematik
Full work available at URL: https://arxiv.org/abs/1711.07622
Mathematics Subject Classification:
- Approximation by polynomials (41A10)
- Algorithms for approximation of functions (65D15)
- Sampling theory in information and communication theory (94A20)
Related Items (13)
- Near-Optimal Sampling Strategies for Multivariate Function Approximation on General Domains
- WARPd: A Linearly Convergent First-Order Primal-Dual Algorithm for Inverse Problems with Approximate Sharpness Conditions
- A novel regularization based on the error function for sparse recovery
- A convergent iterated quasi-interpolation for periodic domain and its applications to surface PDEs
- LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
- Optimal learning
- Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing
- Towards optimal sampling for learning sparse approximation in high dimensions
- Error Localization of Best \(L_1\) Polynomial Approximants
- Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs
- A mixed \(\ell_1\) regularization approach for sparse simultaneous approximation of parameterized PDEs
- Compressive Hermite interpolation: sparse, high-dimensional approximation from gradient-augmented measurements
- The Gap between Theory and Practice in Function Approximation with Deep Neural Networks
Uses Software
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- Reweighted \(\ell_1\) minimization method for stochastic elliptic differential equations
- A weighted \(\ell_1\)-minimization approach for sparse polynomial chaos expansions
- A mathematical introduction to compressive sensing
- The adaptive Lasso in high-dimensional sparse heteroscedastic models
- High-dimensional adaptive sparse polynomial interpolation and applications to parametric PDEs
- Analysis of discrete \(L^2\) projection on polynomial spaces with random evaluations
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs
- A non-adapted sparse approximation of PDEs with stochastic inputs
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Enhancing \(\ell_1\)-minimization estimates of polynomial chaos expansions using basis selection
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Simultaneous estimation and variable selection in median regression using Lasso-type penalty
- Interpolation via weighted \(\ell_{1}\) minimization
- A survey of cross-validation procedures for model selection
- Solvability problems of bivariate interpolation I
- Infinite-dimensional compressed sensing and function interpolation
- Compressed sensing and matrix completion with constant proportion of corruptions
- Multivariate polynomial interpolation on lower sets
- Pivotal estimation via square-root lasso in nonparametric regression
- Infinite-dimensional \(\ell ^1\) minimization and function approximation from pointwise data
- On multivariate polynomial interpolation
- Correcting Data Corruption Errors for Multivariate Function Approximation
- Compressive sensing Petrov-Galerkin approximation of high-dimensional parametric operator equations
- Breaking the Coherence Barrier: A New Theory for Compressed Sensing
- Exact Recoverability from Dense Corrupted Observations via \(\ell_1\)-Minimization
- Stochastic Spectral Galerkin and Collocation Methods for PDEs with Random Coefficients: A Numerical Comparison
- Graph Implementations for Nonsmooth Convex Programs
- Analytic regularity and polynomial approximation of parametric and stochastic elliptic PDE's
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Hierarchical Tensor Approximation of Output Quantities of Parameter-Dependent PDEs
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Signal Recovery and the Large Sieve
- Selective inference with unknown variance via the square-root lasso
- Polynomial approximation via compressed sensing of high-dimensional functions on lower sets
- Compressed Sensing with Sparse Corruptions: Fault-Tolerant Sparse Collocation Approximations
- Sharp Oracle Inequalities for Square Root Regularization
- Robustness to Unknown Error in Sparse Regularization
- New development in freefem++
- Sparse Polynomial Approximation of High-Dimensional Functions
- Discrete least squares polynomial approximation with random evaluations -- application to parametric and stochastic elliptic PDEs
- On the Absence of Uniform Recovery in Many Real-World Applications of Compressed Sensing and the Restricted Isometry Property and Nullspace Property in Levels
- Recovery of Sparsely Corrupted Signals
- Recovering Compressively Sampled Signals Using Partial Support Information
- Dense Error Correction via \(\ell^1\)-Minimization
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- Stochastic collocation algorithms using \(\ell_1\)-minimization
- Compressed sensing