Optimal weighted least-squares methods
From MaRDI portal
Publication:4967355
weighted least squares; error analysis; convergence rates; random matrices; polynomial approximation; multivariate approximation; conditional sampling
Nonparametric estimation (62G05); Approximations to statistical distributions (nonasymptotic) (62E17); Multidimensional problems (41A63); Approximation by polynomials (41A10); Algorithms for approximation of functions (65D15); Least squares and related methods for stochastic control systems (93E24); Rate of convergence, degree of approximation (41A25)
Abstract: We consider the problem of reconstructing an unknown bounded function \(u\) defined on a domain \(X\) from noiseless or noisy samples of \(u\) at \(n\) points \(x^1,\dots,x^n\). We measure the reconstruction error in the norm \(L^2(X,d\rho)\) for some given probability measure \(d\rho\). Given a linear space \(V_m\) with \(\dim(V_m)=m\le n\), we study in general terms the weighted least-squares approximations from the spaces \(V_m\) based on independent random samples. The contribution of the present paper is twofold. From the theoretical perspective, we establish results in expectation and in probability for weighted least squares in general approximation spaces \(V_m\). These results show that, for an optimal choice of sampling measure \(d\mu\) and weight \(w\), which depends on the space \(V_m\) and on the measure \(d\rho\), stability and optimal accuracy are achieved under the mild condition that \(n\) scales linearly with \(m\), up to an additional logarithmic factor. The present analysis also covers cases where the function \(u\) and its approximants from \(V_m\) are unbounded, which might occur for instance in the relevant case where \(X=\mathbb{R}^d\) and \(d\rho\) is the Gaussian measure. From the numerical perspective, we propose a sampling method which allows one to generate independent and identically distributed samples from the optimal measure \(d\mu\). This method becomes of interest in the multivariate setting, where \(d\mu\) is generally not of tensor product type. We illustrate this for particular examples of approximation spaces \(V_m\) of polynomial type, where the domain \(X\) is allowed to be unbounded and high- or even infinite-dimensional, motivated by certain applications to parametric and stochastic PDEs.
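The optimal weighted least-squares scheme described in the abstract can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation, for the univariate Legendre case: \(V_m\) is spanned by the first \(m\) Legendre polynomials orthonormal on \([-1,1]\) with respect to \(d\rho = dx/2\), the optimal measure \(d\mu\) has density \(k_m(x)/m\) with respect to \(d\rho\) where \(k_m(x)=\sum_{j<m} L_j(x)^2\) (the inverse Christoffel function), samples are drawn from \(d\mu\) by rejection, and the weights are \(w(x)=m/k_m(x)\). All function and variable names below are illustrative choices.

```python
import numpy as np

def legendre_basis(x, m):
    """First m Legendre polynomials, orthonormal on [-1,1]
    w.r.t. the uniform probability measure dx/2."""
    # numpy's P_j satisfy  integral P_j^2 dx/2 = 1/(2j+1),
    # so L_j = sqrt(2j+1) * P_j is orthonormal.
    V = np.polynomial.legendre.legvander(x, m - 1)
    return V * np.sqrt(2 * np.arange(m) + 1)

def sample_optimal(m, n, rng):
    """Draw n i.i.d. samples from dmu = (k_m(x)/m) dx/2 by rejection.
    Since L_j(x)^2 <= 2j+1 on [-1,1], k_m(x) <= m^2, so k_m/m^2 <= 1
    is a valid acceptance probability against the uniform proposal."""
    samples = []
    while len(samples) < n:
        x = rng.uniform(-1.0, 1.0, size=4 * n)
        k = np.sum(legendre_basis(x, m) ** 2, axis=1)
        accept = rng.uniform(0.0, 1.0, size=x.size) < k / (m * m)
        samples.extend(x[accept])
    return np.array(samples[:n])

def weighted_least_squares(f, m, n, rng):
    """Fit f by weighted least squares in span{L_0,...,L_{m-1}}
    using n samples from the optimal measure and weights w = m/k_m."""
    x = sample_optimal(m, n, rng)
    B = legendre_basis(x, m)
    k = np.sum(B ** 2, axis=1)
    sqrt_w = np.sqrt(m / k)              # square roots of optimal weights
    coef, *_ = np.linalg.lstsq(sqrt_w[:, None] * B, sqrt_w * f(x), rcond=None)
    return coef

rng = np.random.default_rng(0)
m, n = 8, 200                            # n scales linearly with m
c = weighted_least_squares(np.exp, m, n, rng)
x_test = np.linspace(-1.0, 1.0, 1000)
err = np.max(np.abs(legendre_basis(x_test, m) @ c - np.exp(x_test)))
```

With noiseless samples of a smooth function such as \(e^x\), the fit is close to the best \(L^2\) approximation from \(V_m\); in particular the leading coefficient approximates \(\int_{-1}^1 e^x\,dx/2 = \sinh(1)\). Rejection sampling is used here only because it is the simplest correct sampler; the paper's conditional sampling method is designed for the multivariate setting, where \(d\mu\) is not of tensor product type.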
Recommendations
- Adaptive approximation by optimal weighted least-squares methods
- On the stability and accuracy of least squares approximations
- Sequential sampling for optimal weighted least squares approximations in hierarchical spaces
- Optimal sampling and Christoffel functions on general domains
- Weighted approximate Fekete points: sampling for least-squares polynomial approximation
Cites work
- scientific article; zbMATH DE number 3954145
- scientific article; zbMATH DE number 1077997
- A Christoffel function weighted least squares algorithm for collocation approximations
- A generalized sampling theorem for stable reconstructions in arbitrary bases
- Analysis of discrete L^2 projection on polynomial spaces with random evaluations
- Coherence motivated sampling and convergence analysis of least squares polynomial chaos regression
- Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points
- Discrete least squares polynomial approximation with random evaluations - application to parametric and stochastic elliptic PDEs
- Géza Freud, orthogonal polynomials and Christoffel functions. A case study
- Multivariate Markov-type and Nikolskii-type inequalities for polynomials associated with downward closed multi-index sets
- On the stability and accuracy of least squares approximations
- Sampling and reconstruction of solutions to the Helmholtz equation
- Szegö's extremum problem on the unit circle
- User-friendly tail bounds for sums of random matrices
Cited in (93 documents)
- scientific article; zbMATH DE number 3947502
- scientific article; zbMATH DE number 19013
- Polynomial chaos expansions for dependent random variables
- Approximating smooth, multivariate functions on irregular domains
- scientific article; zbMATH DE number 7404279
- Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
- A probabilistic reduced basis method for parameter-dependent problems
- Function values are enough for \(L_2\)-approximation
- Bounds on Kolmogorov widths and sampling recovery for classes with small mixed smoothness
- CAS4DL: Christoffel adaptive sampling for function approximation via deep learning
- Towards stability results for global radial basis function based quadrature formulas
- Sparse polynomial chaos expansions: literature survey and benchmark
- PLS-based adaptation for efficient PCE representation in high dimensions
- Optimal sampling and Christoffel functions on general domains
- Variable transformations in combination with wavelets and ANOVA for high-dimensional approximation
- On the reconstruction of functions from values at subsampled quadrature points
- Stable approximation of Helmholtz solutions in the disk by evanescent plane waves
- Multivariate approximation in downward closed polynomial spaces
- Randomized least-squares with minimal oversampling and interpolation in general spaces
- Machine learning with kernels for portfolio valuation and risk management
- Boosted optimal weighted least-squares
- On optimal recovery in \(L_2\)
- Weighted approximate Fekete points: sampling for least-squares polynomial approximation
- The Random Feature Model for Input-Output Maps between Banach Spaces
- Randomized numerical linear algebra: Foundations and algorithms
- Operator learning using random features: a tool for scientific computing
- Constructive subsampling of finite frames with applications in optimal function recovery
- Sequential sampling for optimal weighted least squares approximations in hierarchical spaces
- Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
- Weighted least squares fitting using ordinary least squares algorithms
- Fast hyperbolic wavelet regression meets ANOVA
- Towards optimal sampling for learning sparse approximation in high dimensions
- Variational Monte Carlo -- bridging concepts of machine learning and high-dimensional partial differential equations
- "Regression anytime" with brute-force SVD truncation
- Optimality and regularization properties of quasi-interpolation: deterministic and stochastic approaches
- Least squares approximations of measures via geometric condition numbers
- Approximative policy iteration for exit time feedback control problems driven by stochastic differential equations using tensor train format
- An Adaptive Sampling and Domain Learning Strategy for Multivariate Function Approximation on Unknown Domains
- Near-optimal approximation methods for elliptic PDEs with lognormal coefficients
- Lower bounds for integration and recovery in \(L_2\)
- SeAr PC: sensitivity enhanced arbitrary polynomial chaos
- Geometric computation of Christoffel functions on planar convex domains
- Error guarantees for least squares approximation with noisy samples in domain adaptation
- Stable high-order randomized cubature formulae in arbitrary dimension
- A note on sampling recovery of multivariate functions in the uniform norm
- Constructing least-squares polynomial approximations
- On the power of standard information for tractability for \(L_{\infty}\) approximation of periodic functions in the worst case setting
- Random points are good for universal discretization
- Randomized weakly admissible meshes
- Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points
- Discrete least-squares approximations over optimized downward closed polynomial spaces in arbitrary dimension
- Correcting for unknown errors in sparse high-dimensional function approximation
- Multilevel weighted least squares polynomial approximation
- Near-optimal sampling strategies for multivariate function approximation on general domains
- A Weighted Least-Squares Approach to Parameter Estimation Problems Based on Binary Measurements
- Convergence bounds for empirical nonlinear least-squares
- Optimal Monte Carlo methods for \(L^2\)-approximation
- Optimal pointwise sampling for \(L^2\) approximation
- On the stability and accuracy of the empirical interpolation method and gravitational wave surrogates
- On a near optimal sampling strategy for least squares polynomial regression
- One-sided discretization inequalities and sampling recovery
- Least squares approximation of polynomial chaos expansions with optimized grid points
- \(L_2\)-norm sampling discretization and recovery of functions from RKHS with finite trace
- Polynomial surrogates for Bayesian traveltime tomography
- A new upper bound for sampling numbers
- Sample complexity bounds for the local convergence of least squares approximation
- Influence of sampling on the convergence rates of greedy algorithms for parameter-dependent random variables
- Worst-case recovery guarantees for least squares approximation using random samples
- Projection pursuit adaptation on polynomial chaos expansions
- Perturbations of Christoffel-Darboux kernels: detection of outliers
- Sampling, Marcinkiewicz-Zygmund inequalities, approximation, and quadrature rules
- Sampling discretization and related problems
- A Stieltjes Algorithm for Generating Multivariate Orthogonal Polynomials
- Physics-informed polynomial chaos expansions
- An improved quantum algorithm for data fitting
- Multifidelity uncertainty quantification with models based on dissimilar parameters
- A multi-fidelity polynomial chaos-greedy Kaczmarz approach for resource-efficient uncertainty quantification on limited budget
- scientific article; zbMATH DE number 1001372
- Numerical realization of the Mortensen observer via a Hessian-augmented polynomial approximation of the value function
- Adaptive Nonintrusive Reconstruction of Solutions to High-Dimensional Parametric PDEs
- A sharp upper bound for sampling numbers in \(L_2\)
- Semi-Infinite Linear Regression and Its Applications
- On the power of standard information for tractability for \(L_2\)-approximation in the average case setting
- Adaptive approximation by optimal weighted least-squares methods
- Non-intrusive framework of reduced-order modeling based on proper orthogonal decomposition and polynomial chaos expansion
- Sampling discretization of the uniform norm
- Accurate solution of weighted least squares by iterative methods
- Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
- Function values are enough for \(L_2\)-approximation. II
- Variance-based adaptive sequential sampling for polynomial chaos expansion
- Adaptive weighted least-squares polynomial chaos expansion with basis adaptivity and sequential adaptive sampling
- Compressive Hermite interpolation: sparse, high-dimensional approximation from gradient-augmented measurements
- Computation of induced orthogonal polynomial distributions