RLS parameter convergence with overparameterized models (Q1119237)

From MaRDI portal
scientific article
Language: English
Label: RLS parameter convergence with overparameterized models
Description: scientific article
    Statements

    RLS parameter convergence with overparameterized models (English)
    1989
    This paper uses the Lyapunov-type function approach to generalize, in several directions, existing results on the convergence properties of the recursive least-squares (RLS) algorithm applied to (noise-free) linear time-invariant difference equations. Specifically, the paper treats overparameterized models without assuming a priori knowledge of the degree of overparameterization, and it imposes only the weakest possible excitation condition. The RLS estimate is shown to converge to a point of a parameter set H whose elements correspond to reducible parameterizations of the (irreducible) true transfer function. The initial conditions of the RLS algorithm are shown to be qualitatively related to its convergence points. In particular, if the condition number of the initial covariance matrix is chosen equal to one, then the algorithm converges to the unique point of H closest to the initial parameter estimate (such a point always exists because H is a closed convex set).
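    A minimal numerical sketch of the setting summarized above (not taken from the paper; the first-order toy system, its coefficients, and the NumPy-based RLS recursion are illustrative assumptions): noise-free data from an irreducible first-order system are fitted with an overparameterized second-order model by standard RLS, starting from a zero parameter estimate and an identity initial covariance, whose condition number is one.

        import numpy as np

        rng = np.random.default_rng(0)

        # True (irreducible) first-order system, noise-free: y[t] = a*y[t-1] + b*u[t-1]
        a, b = 0.5, 1.0

        # Overparameterized second-order model:
        #   y[t] = th1*y[t-1] + th2*y[t-2] + th3*u[t-1] + th4*u[t-2]
        theta = np.zeros(4)             # initial parameter estimate theta0
        P = np.eye(4)                   # initial covariance P0, condition number 1

        u = rng.standard_normal(20000)  # persistently exciting input
        y = np.zeros_like(u)

        for t in range(2, len(u)):
            y[t] = a * y[t - 1] + b * u[t - 1]                        # generate data
            phi = np.array([y[t - 1], y[t - 2], u[t - 1], u[t - 2]])  # regressor
            # standard RLS recursion without forgetting
            k = P @ phi / (1.0 + phi @ P @ phi)                       # gain vector
            theta = theta + k * (y[t] - phi @ theta)                  # parameter update
            P = P - np.outer(k, phi @ P)                              # covariance update

        # For this toy example, H consists of the parameter vectors
        # (a + c, -a*c, b, -b*c) obtained by inserting a cancelling common factor c.
        # With theta0 = 0, the point of H nearest theta0 is found by minimizing its
        # squared norm over c, which gives c = -a / (1 + a**2 + b**2).
        c = -a / (1 + a**2 + b**2)
        nearest = np.array([a + c, -a * c, b, -b * c])
        print("RLS estimate:      ", np.round(theta, 3))
        print("nearest point of H:", np.round(nearest, 3))

    Under these assumed values the two printed vectors should agree closely, illustrating the reviewed result that, with an initial covariance of condition number one, the estimates converge to the point of H closest to the initial parameter estimate.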
    Lyapunov type function
    recursive least-squares
    algorithm
    overparameterized models
    time-invariant