Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
DOI: 10.1016/j.jat.2020.105472 · zbMATH Open: 1451.41013 · arXiv: 1811.10115 · OpenAlex: W3071135787 · MaRDI QID: Q2209293 · FDO: Q2209293
Authors: Lam Si Tung Ho, Hayden Schaeffer, Giang Tran, Rachel Ward
Publication date: 30 October 2020
Published in: Journal of Approximation Theory
Full work available at URL: https://arxiv.org/abs/1811.10115
Recommendations
- Sparse and robust linear regression: an optimization algorithm and its statistical properties
- Learning sparse polynomial functions
- Correcting data corruption errors for multivariate function approximation
- Signal recovery from incomplete measurements in the presence of outliers
- Learning non-parametric basis independent models from point queries via low-rank methods
MSC classification: Multidimensional problems (41A63); Approximation by polynomials (41A10); Approximations and expansions (41A99)
Cites Work
- The elements of statistical learning. Data mining, inference, and prediction
- Regularization algorithms for learning that are equivalent to multilayer networks
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Nonlinear Regression with Dependent Observations
- Empirical Processes with Applications to Statistics
- Title not available
- Title not available
- Discovering governing equations from data by sparse identification of nonlinear dynamical systems
- Machine learning: Trends, perspectives, and prospects
- A central limit theorem and a strong mixing condition
- Title not available
- Proximal splitting methods in signal processing
- Title not available
- A mathematical introduction to compressive sensing
- System identification in the presence of outliers and random noises: a compressed sensing approach
- High-dimensional adaptive sparse polynomial interpolation and applications to parametric PDEs
- Learning functions of few arbitrary linear parameters in high dimensions
- Compressed sensing and matrix completion with constant proportion of corruptions
- Effectively Well-Conditioned Linear Systems
- Title not available
- Pattern recognition for conditionally independent data
- Interpolation via weighted \(\ell_{1}\) minimization
- A Bernstein-type inequality for some mixing processes and dynamical systems with an application to learning
- Approximation of infinitely differentiable multivariate functions is intractable
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- Minimum complexity regression estimation with weakly dependent observations
- Learning from dependent observations
- Exponential inequalities and functional estimations for weak dependent data: applications to dynamical systems
- On the Convergence of the SINDy Algorithm
- Modeling and nonlinear parameter estimation with Kronecker product representation for coupled oscillators and spatiotemporal systems
- Uniform Recovery Bounds for Structured Random Matrices in Corrupted Compressed Sensing
- Hidden physics models: machine learning of nonlinear partial differential equations
- Optimal weighted least-squares methods
- Polynomial approximation via compressed sensing of high-dimensional functions on lower sets
- Extracting Sparse High-Dimensional Dynamics from Limited Data
- Learning partial differential equations via data discovery and sparse optimization
- Sparse identification of nonlinear dynamics for model predictive control in the low-data limit
- Fast learning from \(\alpha\)-mixing observations
- Learning from Non-iid Data: Fast Rates for the One-vs-All Multiclass Plug-in Classifiers
- Sequential function approximation with noisy data
- Generalization and Robustness of Batched Weighted Average Algorithm with V-Geometrically Ergodic Markov Data
- Compressed Sensing with Sparse Corruptions: Fault-Tolerant Sparse Collocation Approximations
Cited In (5)
- Towards optimal sampling for learning sparse approximation in high dimensions
- Adaptive group Lasso neural network models for functions of few variables and time-dependent data
- Extracting Structured Dynamical Systems Using Sparse Optimization With Very Few Samples
- A generalization bound of deep neural networks for dependent data
- Model selection of chaotic systems from data with hidden variables using sparse data assimilation