Mathematical control theory. Deterministic finite dimensional systems. (English) Zbl 0703.93001
Texts in Applied Mathematics, 6. New York etc.: Springer-Verlag. xiii, 396 p. DM 78.00/hbk (1990).
The book gives a thorough and mathematically rigorous treatment of control and system theory. The subtitle of the book, deterministic finite dimensional systems, gives a rather precise delimitation of the contents. Both continuous-time and discrete-time systems are treated. Though the emphasis is more on linear than on nonlinear systems, many sections are devoted exclusively to nonlinear systems. In an area as wide as control and system theory, it is impossible to cover all topics in a single text. The author is honest in admitting this, and he also indicates that topics such as robust control and adaptive control are not covered. In the Notes and Comments sections at the end of each chapter, many recent developments are briefly mentioned, together with references to these developments. Examples of such directions are discrete event systems, chaotic systems, \(H_{\infty}\) control, and large scale systems. The core of the book deals in depth with rather standard subjects (see the detailed description below). Examples are mainly of an academic nature (the inverted pendulum is not missing); the author gives and discusses references which contain more realistic engineering problems.
The book is written in a clear style and will specifically appeal to the mathematically oriented reader, at the level of a mature advanced undergraduate or beginning graduate student. The style is in the form of definition, theorem, proof. Exercises (proofs of lemmas are also exercises) are interspersed throughout the text. As prerequisites the author mentions a working knowledge of linear algebra and differential equations. At various points in the book notions from other areas of mathematics are used. The reader will appreciate many sections on continuous-time systems better if he knows about measure theory. The author claims that if “measurable function” is replaced by “piecewise continuous function”, the reader can follow the train of thought. This may be true, but then some elegance is lost.
In conclusion, this is an excellent book for readers interested in the mathematical foundations of control and system theory. Within that context it may become (one of) the standard text(s). As a first introduction to control and system theory it may be less suitable, unless the instructor is willing to elaborate on many subjects.
Now a more detailed description of the contents follows. Chapter 1 describes the main contents of the book in an intuitive and informal manner. Various concepts are loosely introduced by means of simple examples. Such concepts include precomputed (open loop) versus feedback (closed loop) controls, linearity, linearization, state, design in the frequency domain, observers, (dis-)advantages of P(I)D controllers, controllability, observability, and pole shifting. The necessity of differential geometry for nonlinear systems is also discussed.
Chapter 2 provides the basic definitions (of system, trajectory, i/o map, time invariance, impulse response, causality, etc.), a classification of systems, and their properties. Discrete-time and continuous-time systems are treated in different sections. Most of the definitions are given with respect to an arbitrary field \(K\), though the emphasis is on the fields \({\mathbb{R}}\) and \({\mathbb{C}}\). Separate sections are devoted to the linearization of continuous-time systems. Other sections deal with sampling and Volterra expansions.
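For orientation (in generic notation, not necessarily the author's), the continuous-time systems treated are of the form
\[ \dot x(t)=f(t,x(t),u(t)),\qquad y(t)=h(t,x(t)), \]
and linearization along a trajectory \((\bar x,\bar u)\) produces the time-varying linear system \(\dot z=A(t)z+B(t)v\), \(w=C(t)z\), where \(A(t)\), \(B(t)\), \(C(t)\) are the partial derivatives of \(f\) and \(h\) with respect to state and control, evaluated along that trajectory.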
Chapter 3 discusses various notions of reachability and controllability in detail. As an example, controllability under sampling is treated. Both time-invariant and time-varying systems are considered. For nonlinear systems, first-order local controllability is treated. Toward that end, topological systems are introduced, as well as the Lie-algebra formalism.
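As a point of reference (the statement is standard and is recalled here only for orientation), for the time-invariant linear system \(\dot x=Ax+Bu\) with \(x\in{\mathbb{R}}^n\), controllability is equivalent to the Kalman rank condition
\[ \operatorname{rank}\,[B\;\;AB\;\;\cdots\;\;A^{n-1}B]=n. \]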
The subject of Chapter 4 is feedback and stabilization. Various notions of stability are discussed (such as Lyapunov stability). The pole shifting theorem also belongs to this chapter. Other items dealt with are controllability indices, disturbance rejection, and \((A,B)\)-invariance. The author prefers the term “asymptotic controllability” to the more familiar “stabilizability”.
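In its standard form, the pole shifting theorem states that if \((A,B)\) is controllable, then for every monic real polynomial \(\chi\) of degree \(n\) there exists a feedback matrix \(F\) such that \(\det(sI-A-BF)=\chi(s)\); in particular, the eigenvalues of \(A+BF\) can be assigned arbitrarily (in complex conjugate pairs), so that the closed loop system \(\dot x=(A+BF)x\) can be made asymptotically stable.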
Chapter 5 deals with outputs and is somewhat dual to Chapter 3. Several notions of observability and distinguishability are discussed (such as with respect to the initial state, with respect to the final state, and control-dependent distinguishability). Some generalizations to polynomial systems are made. Realization theory (factorization of a Markov sequence) is also part of this chapter.
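Dually to the rank condition quoted above, observability of the pair \((A,C)\) in the linear time-invariant case amounts to
\[ \operatorname{rank}\,\begin{pmatrix}C\\ CA\\ \vdots\\ CA^{n-1}\end{pmatrix}=n. \]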
Chapter 6 discusses observers and dynamic feedback. Detectability is called asymptotic observability. External stability and the Nyquist criterion also form part of this chapter.
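The prototype here, in standard notation, is the full-order observer \(\dot{\hat x}=A\hat x+Bu+L(y-C\hat x)\), whose error \(e=x-\hat x\) obeys \(\dot e=(A-LC)e\); asymptotic observability (detectability) is exactly the property needed to choose \(L\) so that this error decays.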
Chapter 7 deals with optimal control. The emphasis is on linear-quadratic (LQ) problems. Though the maximum principle of Pontryagin is mentioned, derivations of necessary and sufficient conditions essentially follow the dynamic programming approach. The infinite-time LQ problem is also dealt with, and precise conditions and derivations are given under which the algebraic Riccati equation makes sense for this problem. Furthermore, sections are devoted to tracking and to Kalman filtering. The interpretation of this filter, and its derivation, is deterministic, along the lines of an optimal control problem.
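For the reader's orientation: in the infinite-time LQ problem with cost \(\int_0^{\infty}(x^TQx+u^TRu)\,dt\), \(Q\ge 0\), \(R>0\), the algebraic Riccati equation referred to is
\[ A^TP+PA-PBR^{-1}B^TP+Q=0, \]
and the optimal control is the feedback \(u=-R^{-1}B^TPx\); the precise conditions alluded to above concern the existence of the appropriate (stabilizing) solution \(P\).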
Appendices (on Linear Algebra, Differentials, Ordinary Differential Equations), an extensive list of 397 references and an index conclude the book.
Reviewer: G.J.Olsder

MSC:
93-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to systems and control theory
49N10 Linear-quadratic optimal control problems