Metric entropy limits on recurrent neural network learning of linear dynamical systems
DOI: 10.1016/J.ACHA.2021.12.004 · zbMath: 1487.93022 · arXiv: 2105.02556 · OpenAlex: W3159224300 · Wikidata: Q114953273 · Scholia: Q114953273 · MaRDI QID: Q2134114
Helmut Bölcskei, Clemens Hutter, Recep Gül
Publication date: 6 May 2022
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/2105.02556
Keywords: system identification; Hardy spaces; linear dynamical systems; recurrent neural networks; metric entropy; universal approximation
MSC: Artificial neural networks and deep learning (68T07); System identification (93B30); Discrete-time control/observation systems (93C55); Linear systems in control theory (93C05)
Related Items (1)
Uses Software
Cites Work
- Fast projection methods for minimal design problems in linear system theory
- Approximating nonlinear fading-memory operators using neural network models
- Reconstructing a neural net from its output
- Foundations of time-frequency analysis
- Multilayer feedforward networks are universal approximators
- On the computational power of neural nets
- Neural network identifiability for a family of sigmoidal nonlinearities
- Affine symmetries and neural network identifiability
- Unconditional bases and bit-level compression
- The uncertainty principle
- On the metric complexity of causal linear systems: ε-Entropy and ε-Dimension for continuous time
- Pick's Theorem-What's the Big Deal?
- Gradient Descent Learns Linear Dynamical Systems
- High-Dimensional Statistics
- Data compression and harmonic analysis
- Deep Neural Network Approximation Theory
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black-Scholes Partial Differential Equations
- Introduction to Mathematical Systems Theory
- A note on metric dimension and feedback in discrete time
- Approximation by superpositions of a sigmoidal function
- Sparse components of images and optimal atomic decompositions