Interpolation via weighted \(\ell_1\) minimization
From MaRDI portal
Publication: Q905907
Abstract: Functions of interest are often smooth and sparse in some sense, and both priors should be taken into account when interpolating sampled data. Classical linear interpolation methods are effective under strong regularity assumptions, but cannot incorporate nonlinear sparsity structure. At the same time, nonlinear methods such as \(\ell_1\) minimization can reconstruct sparse functions from very few samples, but do not necessarily encourage smoothness. Here we show that weighted \(\ell_1\) minimization effectively merges the two approaches, promoting both sparsity and smoothness in reconstruction. More precisely, we provide specific choices of weights in the \(\ell_1\) objective to achieve rates for functions with coefficient sequences in weighted \(\ell_p\) spaces, \(p \leq 1\). We consider the implications of these results for spherical harmonic and polynomial interpolation, in the univariate and multivariate setting. Along the way, we extend concepts from compressive sensing such as the restricted isometry property and null space property to accommodate weighted sparse expansions; these developments should be of independent interest in the study of structured sparse approximations and continuous-time compressive sensing problems.
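The weighted \(\ell_1\) program discussed in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: the problem sizes, the Gaussian measurement matrix, and the index-dependent weight choice \(w_j = \sqrt{1+j}\) are all assumptions for the demo. It solves \(\min_x \sum_j w_j |x_j|\) subject to \(Ax = y\) via the standard linear-programming split \(x = u - v\) with \(u, v \ge 0\):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy setup (sizes are illustrative): recover a sparse coefficient
# vector x_true from m << N linear samples y = A @ x_true.
N, m = 40, 20
A = rng.standard_normal((m, N)) / np.sqrt(m)
x_true = np.zeros(N)
x_true[[1, 3, 7]] = [1.5, -2.0, 1.0]  # sparse, concentrated on low indices
y = A @ x_true

# Weights growing with the coefficient index penalize high-order
# (oscillatory) terms, encouraging smoothness on top of sparsity.
# The sqrt growth here is an assumption for the demo, not the
# paper's prescribed weight choice.
w = np.sqrt(1.0 + np.arange(N))

# min  w . (u + v)   s.t.  A (u - v) = y,  u, v >= 0,  with x = u - v.
c = np.concatenate([w, w])
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * N))
x_hat = res.x[:N] - res.x[N:]

print(np.linalg.norm(x_hat - x_true))
```

With mild weights and few active low-index coefficients, the LP typically recovers `x_true` exactly up to solver tolerance; larger weights on high indices bias the solution toward smoother reconstructions.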
Recommendations
- Infinite-dimensional \(\ell ^1\) minimization and function approximation from pointwise data
- Infinite-dimensional compressed sensing and function interpolation
- Robust recovery of a kind of weighted \(\ell_1\)-minimization without noise level
- Weighted splines as optimal interpolants
- New conditions on stable recovery of weighted sparse signals via weighted \(\ell_1\) minimization
Cites work
- scientific article; zbMATH DE number 432614
- scientific article; zbMATH DE number 49190
- A Bennett concentration inequality and its application to suprema of empirical processes
- A mathematical introduction to compressive sensing
- A short note on compressed sensing with partially known signal support
- A simple proof of the restricted isometry property for random matrices
- A weighted \(\ell_1\)-minimization approach for sparse polynomial chaos expansions
- An upper bound on Jacobi polynomials
- Analytic regularity and polynomial approximation of parametric and stochastic elliptic PDE's
- Analyzing Weighted $\ell_1$ Minimization for Sparse Recovery With Nonuniform Sparse Models
- Approximation in Sobolev spaces by kernel expansions
- Compressed sensing and best \(k\)-term approximation
- Compressive sensing and structured random matrices
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Decoding by Linear Programming
- Inequalities of Bernstein-Jackson-type and the degree of compactness of operators in Banach spaces
- Iterative hard thresholding for compressed sensing
- Lectures on Gaussian Processes
- Model-Based Compressive Sensing
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- On approximate recovery of functions with bounded mixed derivative
- On sparse reconstruction from Fourier and Gaussian measurements
- On the stability and accuracy of least squares approximations
- Recovering Compressively Sampled Signals Using Partial Support Information
- Restricted isometry of Fourier matrices and list decodability of random linear codes
- Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces
- Sampling on energy-norm based sparse grids for the optimal recovery of Sobolev type functions in H^
- Sparse Legendre expansions via \(\ell_1\)-minimization
- Spectral Methods for Uncertainty Quantification
- Stability Results for Scattered Data Interpolation by Trigonometric Polynomials
- Suprema of chaos processes and the restricted isometry property
- Weighted eigenfunction estimates with applications to compressed sensing
Cited in (60)
- Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
- Tight bounds on the mutual coherence of sensing matrices for Wigner d-functions on regular grids
- Nonlinear frames and sparse reconstructions in Banach spaces
- Generalization bounds for sparse random feature expansions
- Polynomial approximation via compressed sensing of high-dimensional functions on lower sets
- Compressed sensing with local structure: uniform recovery guarantees for the sparsity in levels class
- Recovery analysis for weighted mixed \(\ell_2 / \ell_p\) minimization with \(0 < p \leq 1\)
- On polynomial chaos expansion via gradient-enhanced \(\ell_1\)-minimization
- Infinite-dimensional compressed sensing and function interpolation
- Contractive symmetric matrix completion problems related to graphs
- Stable and robust $\ell_p$-constrained compressive sensing recovery via robust width property
- A gradient enhanced \(\ell_{1}\)-minimization for sparse approximation of polynomial chaos expansions
- Randomized numerical linear algebra: Foundations and algorithms
- Robust width: a characterization of uniformly stable and robust compressed sensing
- Sampling numbers of smoothness classes via \(\ell^1\)-minimization
- Compressive sensing Petrov-Galerkin approximation of high-dimensional parametric operator equations
- A theoretical study of compressed solving for advection-diffusion-reaction problems
- Minimum norm interpolation in the \(\ell_1(\mathbb{N})\) space
- Structure and Optimisation in Computational Harmonic Analysis: On Key Aspects in Sparse Regularisation
- Towards optimal sampling for learning sparse approximation in high dimensions
- Recovery analysis for weighted \(\ell_{1}\)-minimization using the null space property
- Infinite-dimensional \(\ell ^1\) minimization and function approximation from pointwise data
- Enhancing sparsity of Hermite polynomial expansions by iterative rotations
- A Gradient-Enhanced L1 Approach for the Recovery of Sparse Trigonometric Polynomials
- scientific article; zbMATH DE number 7370565
- A general framework of rotational sparse approximation in uncertainty quantification
- Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs
- Reconstruction of sparse Legendre and Gegenbauer expansions
- Reconstruction of Sparse Polynomials via Quasi-Orthogonal Matching Pursuit Method
- Discrete least-squares approximations over optimized downward closed polynomial spaces in arbitrary dimension
- Correcting for unknown errors in sparse high-dimensional function approximation
- Compressed sensing of data with a known distribution
- Convergence bounds for empirical nonlinear least-squares
- Sparse approximation of multivariate functions from small datasets via weighted orthogonal matching pursuit
- Extracting Sparse High-Dimensional Dynamics from Limited Data
- A mixed ℓ1 regularization approach for sparse simultaneous approximation of parameterized PDEs
- Extracting Structured Dynamical Systems Using Sparse Optimization With Very Few Samples
- The Discrete Empirical Interpolation Method: Canonical Structure and Formulation in Weighted Inner Product Spaces
- Sufficient conditions on stable reconstruction of weighted problem
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Stable recovery of weighted sparse signals from phaseless measurements via weighted l1 minimization
- Basis adaptive sample efficient polynomial chaos (BASE-PC)
- Sliced-Inverse-Regression--Aided Rotated Compressive Sensing Method for Uncertainty Quantification
- Analysis of sparse recovery for Legendre expansions using envelope bound
- Sample complexity bounds for the local convergence of least squares approximation
- The gap between theory and practice in function approximation with deep neural networks
- Worst-case recovery guarantees for least squares approximation using random samples
- The recovery guarantee for orthogonal matching pursuit method to reconstruct sparse polynomials
- Generalization error of minimum weighted norm and kernel interpolation
- Accelerating stochastic collocation methods for partial differential equations with random input data
- Robust recovery of a kind of weighted \(\ell_1\)-minimization without noise level
- Improved bounds for sparse recovery from subsampled random convolutions
- On the Absence of Uniform Recovery in Many Real-World Applications of Compressed Sensing and the Restricted Isometry Property and Nullspace Property in Levels
- Compressive sensing with redundant dictionaries and structured measurements
- Compressed sensing with sparse corruptions: fault-tolerant sparse collocation approximations
- Tractability of sampling recovery on unweighted function classes
- New conditions on stable recovery of weighted sparse signals via weighted \(\ell_1\) minimization
- A class of null space conditions for sparse recovery via nonconvex, non-separable minimizations
- Breaking the coherence barrier: a new theory for compressed sensing
- Compressive Hermite interpolation: sparse, high-dimensional approximation from gradient-augmented measurements