Convergence of the least-squares method with a polynomial regularizer for the infinite-dimensional autoregression equation
From MaRDI portal
Publication:2386486
DOI: 10.1007/s10513-005-0009-1 · zbMath: 1130.93436 · OpenAlex: W4255650174 · MaRDI QID: Q2386486
Yulia R. Gel, Andrey E. Barabanov
Publication date: 23 August 2005
Published in: Automation and Remote Control
Full work available at URL: https://doi.org/10.1007/s10513-005-0009-1
Related Items (1)
Cites Work
- Strong convergence of the method of least squares
- Asymptotically efficient selection of the order of the model for estimating parameters of a linear process
- Asymptotic properties of projections with applications to stochastic regression problems
- The identification of a linear model of a stationary process by its realization
- Nonasymptotic bounds for autoregressive time series modeling.
- A covariance extension approach to identification of time series
- Estimation and information in stationary time series
- Testing for unit roots in autoregressive-moving average models of unknown order
- AR(∞) estimation and nonparametric stochastic complexity
- Analysis of recursive stochastic algorithms
- Estimation of autoregressive moving-average models via high-order autoregressive approximations
- Identification of an unstable ARMA equation
- A new look at the statistical model identification