Dynamical systems and their applications: linear theory (Q761385)

From MaRDI portal
scientific article

    Statements

    Dynamical systems and their applications: linear theory (English)
    1977
This book is of interest to research workers in systems theory who wish to acquaint themselves with the dynamical approach to, and the description of, the behavior of diverse systems. The mathematical prerequisites are an elementary knowledge of set theory, matrix algebra, and ordinary differential equations. The book is dedicated to the memory of the famous scientist Aleksandr Mikhajlovich Letov. It contains eight chapters, which fall by content into four main parts: Chapters 1 and 2 - an introduction; Chapters 3, 4 and 5 - structural properties of systems; Chapter 6 - modeling; Chapters 7 and 8 - the behavior of systems.

Most of the theorems in this book are stated without proof, but arguments are advanced beforehand, either clarifying the meaning of the theorem or outlining its proof. At the end of each chapter the author offers a short bibliography with comments pointing out where one can find a proof of a specific theorem, a more detailed discussion of certain questions, or the source materials. Basically the author investigates continuous systems and, rather briefly, some problems concerning the dynamic properties of discrete systems. The exposition is illustrated by examples from various disciplines, such as the dynamics of water balance, industrial applications, economic planning, population migration, the flow of street traffic, the armament race, etc. Each chapter contains exercises in which the material presented is of independent interest.

The first chapter is devoted to the exposition of basic concepts, which are illustrated by specific examples. Concepts used later in the text are explained; these include, for example, input, output, state, the description and realization of a system, controllability, observability, stability, feedback, optimality, and stochastic perturbations.
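Controllability, one of the concepts listed above, admits a compact computational test. As a minimal sketch (not taken from the book; the standard Kalman rank condition is used here, presumably of the same kind as the controllability conditions of Chapter 3), in Python with NumPy:

```python
import numpy as np

def controllability_matrix(A, B):
    """Stack [B, AB, A^2 B, ..., A^(n-1) B] for the system x' = Ax + Bu."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

# Double integrator x1' = x2, x2' = u -- a standard controllable example.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

C = controllability_matrix(A, B)
# The system is controllable iff the controllability matrix has full rank.
controllable = np.linalg.matrix_rank(C) == A.shape[0]
```

The same rank computation applies unchanged to discrete systems x(k+1) = A x(k) + B u(k), which is one reason the concept transfers between the continuous and discrete settings treated in the book.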
In the second chapter the author defines a dynamical system and describes linear dynamical systems, the external and internal descriptions of a system, the idea of frequency techniques, and the transfer function and impulse response of a system.

In the third chapter the author discusses controllability and attainability, examining these concepts in detail for systems described by ordinary linear differential equations, equations with retarded argument, and discrete systems, and introducing conditions for controllability and for structural controllability. The fourth chapter is devoted to observability and to synthesis; basic concepts and theorems and the method of moments are introduced. More theoretical problems are discussed in Chapter 5, namely structural theorems, controllable and observable canonical forms, observability of canonical forms, invariant properties of transfer matrices, and the feedback group and its invariant properties.

The sixth chapter is entirely devoted to the problem of system realization. The following problem is investigated: if the behavior of a system is given in input-output form, is it possible to describe the process by differential or finite-difference equations, and if so, is such a description unique? Equivalent algebraic systems are investigated, and the author discusses some minimization problems, i.e. problems of simplest realizations, and algorithms for synthesizing such systems, for constructive synthesis, and for realizing the transfer function.

The seventh chapter discusses the important problem of the stability of dynamical systems. First the concept of stability is illustrated by simple examples; then stability in the sense of Lyapunov is defined, and methods for investigating stability, such as those of Routh, Hurwitz, and Lyapunov, and frequency techniques, are given.
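To illustrate the Routh-Hurwitz style of test mentioned above (a minimal sketch, not taken from the book), the Hurwitz conditions for a cubic characteristic polynomial can be written out directly:

```python
def hurwitz_stable_cubic(a2, a1, a0):
    """Hurwitz test for s^3 + a2 s^2 + a1 s + a0: all roots lie in the
    open left half-plane iff all coefficients are positive and a2*a1 > a0."""
    return a2 > 0 and a1 > 0 and a0 > 0 and a2 * a1 > a0

# s^3 + 6 s^2 + 11 s + 6 = (s+1)(s+2)(s+3): all roots negative, stable.
# s^3 + s^2 + s + 5: coefficients positive but 1*1 < 5, hence unstable.
```

Such algebraic criteria decide stability from the coefficients alone, without computing the roots, which is what makes them useful alongside Lyapunov and frequency-domain methods.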
The author discusses the stability of feedback systems, of modal control systems, and questions of structural stability. The eighth chapter considers the optimization of processes that are described by a linear system of ordinary differential equations and have an integral quadratic quality criterion. Pontryagin's maximum principle is stated, optimal control for feedback systems is discussed, and some concepts of numerical computation are introduced. Other topics include discrete systems, structural stability of closed loops, optimal systems, linear filtering theory, and problems of control for stochastic systems.

This book can serve as a guide, treating the fundamental questions of the linear theory of dynamical systems with sufficient rigor and simplicity, and pointing out the bibliography necessary for pursuing a more detailed or advanced study of each topic.
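The linear-quadratic problem of the eighth chapter can be illustrated in its simplest scalar case, where the algebraic Riccati equation is solvable by hand. This sketch is not from the book; the names a, b, q, r are illustrative assumptions:

```python
import math

# Scalar LQ regulator: minimize the integral of (q x^2 + r u^2) dt
# subject to x' = a x + b u, with q >= 0, r > 0.
# The algebraic Riccati equation 2 a p - (b^2 / r) p^2 + q = 0 yields p,
# and the optimal feedback is u = -k x with gain k = b p / r.

def scalar_lqr(a, b, q, r):
    s = b * b / r
    # Positive root of the quadratic (b^2/r) p^2 - 2 a p - q = 0.
    p = (a + math.sqrt(a * a + s * q)) / s
    k = b * p / r
    return p, k

p, k = scalar_lqr(a=1.0, b=1.0, q=1.0, r=1.0)
# Closed loop x' = (a - b k) x is stable when a - b k < 0.
```

For a = b = q = r = 1 this gives p = 1 + sqrt(2), and the closed-loop pole a - b k = -sqrt(2) is indeed in the left half-plane, matching the well-known guaranteed stability of the LQ-optimal feedback.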
    continuous systems
    dynamic properties of discrete systems
    dynamics of water balance
    industrial applications
    economic planning
    population migration
    flow of street traffic
    armament race
    stability of dynamical systems
    integral quadratic quality criterion
    maximality principle
    structural stability of closed loops
    optimal systems
    linear filtering theory
    control for stochastic systems