Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
Abstract: Learning non-linear systems from noisy, limited, and/or dependent data is an important task across various scientific fields including statistics, engineering, computer science, and mathematics. In general, this learning task is ill-posed; however, additional information about the data's structure or about the behavior of the unknown function can make the task well-posed. In this work, we study the problem of learning nonlinear functions from corrupted and dependent data. The learning problem is recast as a sparse robust linear regression problem where we incorporate both the unknown coefficients and the corruptions in a basis pursuit framework. The main contribution of our paper is to provide a reconstruction guarantee for the associated \(\ell_1\)-optimization problem where the sampling matrix is formed from dependent data. Specifically, we prove that the sampling matrix satisfies the null space property and the stable null space property, provided that the data are compact and satisfy a suitable concentration inequality. We show that our recovery results are applicable to various types of dependent data such as exponentially strongly \(\alpha\)-mixing data, geometrically \(\mathcal{C}\)-mixing data, and uniformly ergodic Markov chains. Our theoretical results are verified via several numerical simulations.
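The joint basis pursuit formulation described in the abstract can be sketched in a few lines. The following Python snippet is an illustrative sketch, not the authors' code: it draws weakly dependent samples from an AR(1) chain (a simple stand-in for the mixing processes named above), builds the monomial sampling matrix, corrupts a few measurements, and solves the joint \(\ell_1\) problem over coefficients and corruptions. The polynomial degree, chain parameters, and weight `lam` are assumptions chosen for illustration.

```python
# A minimal sketch (not the authors' implementation) of the extended
# basis pursuit problem: recover sparse polynomial coefficients c and
# sparse corruptions e jointly from dependent samples.
# Requires numpy and cvxpy.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

# Weakly dependent samples: a stationary AR(1)-type chain kept in [-1, 1].
m = 200                                      # number of samples
x = np.empty(m)
x[0] = rng.uniform(-1, 1)
for t in range(1, m):
    x[t] = np.clip(0.7 * x[t - 1] + 0.3 * rng.uniform(-1, 1), -1, 1)

# Sampling matrix of monomials 1, x, x^2, ..., x^d evaluated at the chain.
d = 10
A = np.vander(x, N=d + 1, increasing=True)

# Ground truth: a sparse coefficient vector, plus a few gross outliers.
c_true = np.zeros(d + 1)
c_true[[1, 3]] = [2.0, -1.0]                 # f(x) = 2x - x^3
e_true = np.zeros(m)
idx = rng.choice(m, size=5, replace=False)   # 5 corrupted measurements
e_true[idx] = rng.normal(0, 5, size=5)
y = A @ c_true + e_true

# Joint l1 minimization over coefficients and corruptions:
#   min ||c||_1 + lam * ||e||_1  subject to  A c + e = y
c = cp.Variable(d + 1)
e = cp.Variable(m)
lam = 1.0                                    # weight between the two l1 terms
prob = cp.Problem(cp.Minimize(cp.norm(c, 1) + lam * cp.norm(e, 1)),
                  [A @ c + e == y])
prob.solve()
print("coefficient error:", np.linalg.norm(c.value - c_true))
```

In the noisy setting one would relax the equality constraint to a tolerance, e.g. \(\|Ac + e - y\|_2 \le \eta\), which is the regime addressed by the stable null space property in the paper.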
Recommendations
- Sparse and robust linear regression: an optimization algorithm and its statistical properties
- Learning sparse polynomial functions
- Correcting data corruption errors for multivariate function approximation
- Signal recovery from incomplete measurements in the presence of outliers
- Learning non-parametric basis independent models from point queries via low-rank methods
Cites work
- scientific article (zbMATH DE number 4105912; title not available)
- scientific article (zbMATH DE number 3551792; title not available)
- scientific article (zbMATH DE number 778109; title not available)
- scientific article (zbMATH DE number 5204938; title not available)
- A Bernstein-type inequality for some mixing processes and dynamical systems with an application to learning
- A central limit theorem and a strong mixing condition
- A mathematical introduction to compressive sensing
- Approximation of infinitely differentiable multivariate functions is intractable
- Compressed sensing and matrix completion with constant proportion of corruptions
- Compressed sensing with sparse corruptions: fault-tolerant sparse collocation approximations
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- Deep learning
- Discovering governing equations from data by sparse identification of nonlinear dynamical systems
- Exponential inequalities and functional estimations for weak dependent data: applications to dynamical systems
- Effectively Well-Conditioned Linear Systems
- Empirical Processes with Applications to Statistics
- Extracting Sparse High-Dimensional Dynamics from Limited Data
- Fast learning from \(\alpha\)-mixing observations
- Generalization and Robustness of Batched Weighted Average Algorithm with V-Geometrically Ergodic Markov Data
- Hidden physics models: machine learning of nonlinear partial differential equations
- High-dimensional adaptive sparse polynomial interpolation and applications to parametric PDEs
- Interpolation via weighted \(\ell_{1}\) minimization
- Learning from dependent observations
- Learning from non-iid data: fast rates for the one-vs-all multiclass plug-in classifiers
- Learning functions of few arbitrary linear parameters in high dimensions
- Learning partial differential equations via data discovery and sparse optimization
- Machine learning: Trends, perspectives, and prospects
- Minimum complexity regression estimation with weakly dependent observations
- Modeling and nonlinear parameter estimation with Kronecker product representation for coupled oscillators and spatiotemporal systems
- Nonlinear Regression with Dependent Observations
- On the convergence of the SINDy algorithm
- Optimal weighted least-squares methods
- Pattern recognition for conditionally independent data
- Polynomial approximation via compressed sensing of high-dimensional functions on lower sets
- Proximal splitting methods in signal processing
- Regularization algorithms for learning that are equivalent to multilayer networks
- Sequential function approximation with noisy data
- Sparse identification of nonlinear dynamics for model predictive control in the low-data limit
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- System identification in the presence of outliers and random noises: a compressed sensing approach
- The elements of statistical learning. Data mining, inference, and prediction
- Uniform Recovery Bounds for Structured Random Matrices in Corrupted Compressed Sensing
Cited in (7)
- Learning from non-random data in Hilbert spaces: an optimal recovery perspective
- Correcting data corruption errors for multivariate function approximation
- Towards optimal sampling for learning sparse approximation in high dimensions
- Model selection of chaotic systems from data with hidden variables using sparse data assimilation
- A generalization bound of deep neural networks for dependent data
- Extracting Structured Dynamical Systems Using Sparse Optimization With Very Few Samples
- Adaptive group Lasso neural network models for functions of few variables and time-dependent data