
Mathematical control theory: an introduction. (English) Zbl 1071.93500

From the preface: The aim of this book is to give a self-contained outline of mathematical control theory. The work consciously concentrates on typical and characteristic results, presented in four parts preceded by an introduction. The introduction surveys basic concepts and questions of the theory and describes typical, motivating examples.
Part I is devoted to structural properties of linear systems. It contains basic results on controllability, observability, stability and stabilizability. A separate chapter covers realization theory. Toward the end, more specialized topics are treated: linear systems with bounded sets of control parameters and so-called positive systems.
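To indicate the flavour of the linear theory, the central controllability result can be recalled in its standard form (stated here only for orientation, not as a quotation from the book): the system
\[
\dot x(t) = Ax(t) + Bu(t), \qquad x(t)\in\mathbb{R}^n,\ u(t)\in\mathbb{R}^m,
\]
is controllable if and only if the Kalman rank condition
\[
\operatorname{rank}\,[\,B \;\; AB \;\; \cdots \;\; A^{n-1}B\,] = n
\]
holds.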
Structural properties of nonlinear systems are the content of Part II, which is similar in setting to Part I. It starts from an analysis of controllability and observability and then discusses stability and stabilizability in great detail. It also presents typical theorems on nonlinear realizations.
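The geometric analysis of nonlinear controllability can be illustrated, again in its standard formulation rather than as a quotation from the book, by the Lie-algebraic accessibility criterion: a control-affine system
\[
\dot x = f(x) + \sum_{i=1}^{m} u_i\, g_i(x)
\]
is accessible from a point $x_0$ whenever the Lie algebra generated by the vector fields $f, g_1, \dots, g_m$ spans the whole state space at $x_0$.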
Part III concentrates on the question of how to find optimal controls. It discusses Bellman’s optimality principle and its typical applications to the linear regulator problem and to impulse control. It gives a proof of Pontryagin’s maximum principle for classical problems with fixed control intervals as well as for time-optimal and impulse control problems. Existence problems are considered in the final chapters, which also contain the basic Filippov theorem.
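For the linear regulator problem, the typical outcome of Bellman's approach can be recalled in its standard infinite-horizon form (again only for orientation): for $\dot x = Ax + Bu$ with cost
\[
J(u)=\int_0^{\infty}\bigl(x(t)^{\top}Qx(t)+u(t)^{\top}Ru(t)\bigr)\,dt,\qquad Q\ge 0,\ R>0,
\]
and under the usual stabilizability and detectability assumptions, the optimal control is the linear feedback $u(t)=-R^{-1}B^{\top}Px(t)$, where $P\ge 0$ solves the algebraic Riccati equation
\[
A^{\top}P + PA - PBR^{-1}B^{\top}P + Q = 0 .
\]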
Part IV is devoted to infinite-dimensional systems. The course is limited to linear systems and to the so-called semigroup approach. The first chapter treats linear systems without control and is, in a sense, a concise presentation of the theory of semigroups of linear operators. The following two chapters concentrate on controllability, stability and stabilizability of linear systems and the final one on the linear regulator problem in Hilbert spaces.
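The semigroup approach can be summarized, in its standard form, as follows: the controlled equation is viewed as an abstract Cauchy problem
\[
\dot x(t) = Ax(t) + Bu(t), \qquad x(0)=x_0,
\]
on a Hilbert space, where $A$ generates a strongly continuous semigroup $S(t)$ and $B$ is, say, a bounded control operator; its mild solution is
\[
x(t) = S(t)x_0 + \int_0^{t} S(t-s)Bu(s)\,ds .
\]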
Besides classical topics, the book also discusses less traditional ones. In particular, great attention is paid to realization theory and to geometric methods for the analysis of controllability, observability and stabilizability of linear and nonlinear systems. One can find here recent results on positive, impulsive and infinite-dimensional systems. To preserve some uniformity of style, discrete systems as well as stochastic ones have not been included. This was a conscious compromise. Each would be worthy of a separate book.
Control theory is today a separate branch of mathematics, and each of the topics covered in this book has an extensive literature. Therefore the book is only an introduction to control theory.
Knowledge of basic facts from linear algebra, differential equations and calculus is required. Only the final part of the book assumes familiarity with more advanced mathematics.

MSC:

93-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to systems and control theory
93Bxx Controllability, observability, and system structure
49Kxx Optimality conditions
34H05 Control problems involving ordinary differential equations
47N70 Applications of operator theory in systems, signals, circuits, and control theory