Generalization bounds for learning weighted automata
Publication: 1704563
DOI: 10.1016/J.TCS.2017.11.023
zbMATH Open: 1388.68148
arXiv: 1610.07883
OpenAlex: W2774279804
MaRDI QID: Q1704563
FDO: Q1704563
Authors: Borja Balle, Mehryar Mohri
Publication date: 12 March 2018
Published in: Theoretical Computer Science
Abstract: This paper studies the problem of learning weighted automata from a finite labeled training sample. We consider several general families of weighted automata defined in terms of three different measures: the norm of an automaton's weights, the norm of the function computed by an automaton, or the norm of the corresponding Hankel matrix. We present new data-dependent generalization guarantees for learning weighted automata expressed in terms of the Rademacher complexity of these families. We further present upper bounds on these Rademacher complexities, which reveal key new data-dependent terms related to the complexity of learning weighted automata.
Full work available at URL: https://arxiv.org/abs/1610.07883
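For orientation, the quantities mentioned in the abstract can be written in their standard textbook form (a sketch of the usual definitions; the notation below is illustrative and not quoted from the paper). A weighted automaton \(A\) with \(n\) states over an alphabet \(\Sigma\) is given by vectors \(\boldsymbol\alpha, \boldsymbol\beta \in \mathbb{R}^n\) and transition matrices \(\mathbf{A}_a \in \mathbb{R}^{n \times n}\) for each \(a \in \Sigma\), and computes
\[
f_A(x_1 \cdots x_k) = \boldsymbol\alpha^{\top} \mathbf{A}_{x_1} \cdots \mathbf{A}_{x_k} \boldsymbol\beta .
\]
The associated Hankel matrix \(\mathbf{H}_f\) has entries \(\mathbf{H}_f(u, v) = f_A(uv)\), indexed by prefix strings \(u\) and suffix strings \(v\). The empirical Rademacher complexity of a hypothesis class \(\mathcal{H}\) on a sample \(S = (x_1, \ldots, x_m)\) is
\[
\widehat{\mathfrak{R}}_S(\mathcal{H}) = \mathbb{E}_{\boldsymbol\sigma}\left[ \sup_{h \in \mathcal{H}} \frac{1}{m} \sum_{i=1}^{m} \sigma_i\, h(x_i) \right],
\]
where the \(\sigma_i\) are independent uniform \(\pm 1\) variables. The bounds in the paper control this quantity for classes of weighted automata obtained by bounding the norm of the weights \((\boldsymbol\alpha, \boldsymbol\beta, \{\mathbf{A}_a\})\), of the computed function \(f_A\), or of the Hankel matrix \(\mathbf{H}_{f_A}\).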
Recommendations
- Learning weighted automata
- Weighted automata are compact and actively learnable
- A lower bound for learning distributions generated by probabilistic automata
- Learning weighted automata over principal ideal domains
- Scientific article (zbMATH DE number 7649925)
- Spectral learning of weighted automata. A forward-backward perspective
- On the Rademacher complexity of weighted automata
- On Optimal Learning Algorithms for Multiplicity Automata
- Algorithmic Learning Theory
Cites Work
- Biological Sequence Analysis
- Understanding machine learning. From theory to algorithms
- Uniform Central Limit Theorems
- An introduction to matrix concentration inequalities
- Combinatorial methods in density estimation
- Handbook of weighted automata
- Noncommutative rational series with applications
- Matrices de Hankel
- A trace inequality of John von Neumann
- Foundations of machine learning
- Model checking linear-time properties of probabilistic systems
- Realizations by stochastic finite automata
- Weighted automata algorithms
- On the computational complexity of approximating distributions by probabilistic automata
- Spectral learning of weighted automata. A forward-backward perspective
- \(L_p\) distance and equivalence of probabilistic automata
- Absolute convergence of rational series is semi-decidable
- Rational kernels: theory and algorithms
- VC-dimensions of finite automata and commutative finite automata with \(k\) letters and \(n\) states
- Learning weighted automata
- Dimension-free concentration bounds on Hankel matrices for spectral learning
- On the Rademacher complexity of weighted automata
- Formal Analysis of Online Algorithms
- A Canonical Form for Weighted Automata and Applications to Approximate Minimization
Cited in: 5 documents