Optimal sampling points in reproducing kernel Hilbert spaces
From MaRDI portal
Publication:272199
Abstract: Recent developments in basis pursuit and compressed sensing seek to extract information from as few samples as possible. Since the number of samples in such applications is restricted, the sampling points should be deployed wisely. Motivated by this, we study the optimal distribution of finitely many sampling points. Formulating the problem in the framework of optimal reconstruction yields a minimization problem. In the discrete case, we estimate the distance between the optimal subspace resulting from a general Karhunen-Loève transform and the kernel space, which leads to a second, computationally favorable algorithm. Numerical experiments illustrate the performance of the algorithms in finding optimal sampling points.
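The abstract does not spell out the paper's algorithms, so the sketch below illustrates a standard related technique rather than the authors' Karhunen-Loève-based method: greedy selection of sampling points in a reproducing kernel Hilbert space by maximizing the power function, whose value at a point bounds the worst-case reconstruction error there. The Gaussian kernel, the interval [0, 1], and the function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, gamma=5.0):
    # Illustrative choice: Gaussian (RBF) reproducing kernel on [0, 1].
    return np.exp(-gamma * (x[:, None] - y[None, :]) ** 2)

def greedy_sampling_points(candidates, n_points, kernel=gaussian_kernel):
    """Greedily pick sampling points by maximizing the power function
    P(x)^2 = k(x, x) - k_X(x)^T K^{-1} k_X(x), the squared pointwise
    worst-case reconstruction error over the unit ball of the RKHS."""
    chosen = []
    diag = np.diag(kernel(candidates, candidates)).copy()
    for _ in range(n_points):
        if not chosen:
            power2 = diag  # no points yet: P(x)^2 = k(x, x)
        else:
            X = np.array(chosen)
            # Small jitter keeps the kernel matrix numerically invertible.
            K = kernel(X, X) + 1e-12 * np.eye(len(X))
            kx = kernel(candidates, X)
            # Quadratic form k_X(x)^T K^{-1} k_X(x) for every candidate x.
            quad = np.einsum("ij,ij->i", kx, np.linalg.solve(K, kx.T).T)
            power2 = diag - quad
        chosen.append(candidates[int(np.argmax(power2))])
    return np.array(chosen)

grid = np.linspace(0.0, 1.0, 201)
pts = greedy_sampling_points(grid, 5)
```

For the (constant-diagonal) Gaussian kernel this greedy rule spreads the points out over the interval, which matches the intuition in the abstract that a limited sampling budget should be deployed to cover the domain well.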
Recommendations
- Sampling analysis in the complex reproducing kernel Hilbert space
- Sampling and quasi-optimal approximation for signals in a reproducing kernel space of homogeneous type
- Sampling theory and reproducing kernel Hilbert spaces
- The general sampling theory by using reproducing kernels
- Nonuniform sampling, reproducing kernels, and the associated Hilbert spaces
Cites work
- scientific article; zbMATH DE number 4144656
- scientific article; zbMATH DE number 44104
- scientific article; zbMATH DE number 1391397
- scientific article; zbMATH DE number 5055767
- Atomic Decomposition by Basis Pursuit
- Bounds for Truncation Error of the Sampling Expansion
- Deterministic and stochastic error bounds in numerical analysis
- Extrapolation in variable RKHSs with application to the blood glucose reading
- Finite rank kernels for multi-task learning
- Frames, Riesz bases, and sampling expansions in Banach spaces via semi-inner products
- Function spaces for sampling expansions
- General sampling theorems for functions in reproducing kernel Hilbert spaces
- Learning Theory
- Learning the kernel function via regularization
- Lectures on Fourier Integrals. (AM-42)
- Linear information versus function evaluations for L₂-approximation
- Metric spaces and completely monotone functions
- On the mathematical foundations of learning
- On the power of standard information for multivariate approximation in the worst case setting
- On the power of standard information for weighted approximation
- On the regularized Whittaker-Kotel’nikov-Shannon sampling formula
- Optimal learning of bandlimited functions from localized sampling
- Regularization networks and support vector machines
- Reproducing kernel Banach spaces for machine learning
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Scattered Data Approximation
- The power of standard information for multivariate approximation in the randomized setting
- Theory of Reproducing Kernels
- Tractability of multivariate problems. Volume I: Linear information
- Tractability of multivariate problems. Volume II: Standard information for functionals.
- Universal kernels
Cited in (5)
- Generation of point sets by convex optimization for interpolation in reproducing kernel Hilbert spaces
- Effective methods for obtaining good points for quadrature in reproducing kernel Hilbert spaces
- A kernel view on manifold sub-sampling based on Karcher variance optimization
- Random sampling in reproducing kernel spaces with mixed norm
- Sampling and quasi-optimal approximation for signals in a reproducing kernel space of homogeneous type