Optimal Sampling of Parametric Families: Implications for Machine Learning
Publication: 5131174
DOI: 10.1162/neco_a_01251 · zbMath: 1473.68149 · OpenAlex: W2986435634 · Wikidata: Q91189930 · Scholia: Q91189930 · MaRDI QID: Q5131174
Jithendar Anumula, Adrian E. G. Huber, Shih-Chii Liu
Publication date: 2 November 2020
Published in: Neural Computation
Full work available at URL: https://www.zora.uzh.ch/id/eprint/184150/1/document09.56.pdf
MSC classification:
- Inference from stochastic processes and prediction (62M20)
- Learning and adaptive systems in artificial intelligence (68T05)
- Diffusion processes (60J60)
- Sequential estimation (62L12)
Cites Work
- Predictors for the first-order autoregressive process
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
- Theory of statistical inference and information. Transl. from the Slovak by the author
- Universal coding, information, prediction, and estimation
- Properties of Predictors for Autoregressive Time Series
- Universal prediction
- A strong version of the redundancy-capacity theorem of universal coding
- Fisher information and stochastic complexity
- Nonparametric risk bounds for time-series forecasting