Optimal control theory for infinite dimensional systems. (English) Zbl 0817.49001

Basel: Birkhäuser. xii, 448 p. (1994).
This book is devoted to the study of control problems for infinite-dimensional systems, including both stationary and evolution equations. The book is self-contained and is divided into nine chapters.
Chapter 1 presents some typical control problems arising in infinite-dimensional spaces. Chapter 2 collects the mathematical preliminaries required for reading the book; these include some functional analysis and (linear and semilinear) partial differential equations. The existence theory for optimal control problems is the objective of Chapter 3. Before proving the existence theorems, the authors devote two sections to the study of Souslin and Polish spaces as well as to the theory of multifunctions and selection theorems. Cesari's theory is the starting point for the proofs of the existence theorems.

Chapters 4 and 5 contain the theory of necessary conditions for optimality (in the form of Pontryagin's maximum principle) for abstract evolution and elliptic equations, respectively. The main feature of the proofs of these conditions is a kind of spike perturbation distributed in a special way over the control domain, in contrast to the usual Pontryagin perturbations, which are localized around a point. Chapter 6 is devoted to dynamic programming, including the study of viscosity solutions of Hamilton-Jacobi-Bellman equations.

Two topics are considered in Chapter 7: the controllability of linear and semilinear systems and the theory of time-optimal control. The dynamic programming method is applied to the study of optimal switching and impulse controls in Chapter 8, and a unified theory is presented within the framework of Chapter 6. Chapter 9, the last one, is concerned with the theory of linear-quadratic optimal control problems with finite or infinite horizon, studying the corresponding Riccati equations.
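To indicate the flavor of the linear-quadratic theory treated in Chapter 9, consider the following standard infinite-horizon LQ problem on a Hilbert space (the notation below is the reviewer's, not taken from the book, and is only a schematic sketch):

```latex
% State equation on a Hilbert space H, with A generating a C_0-semigroup,
% B a bounded control operator, and Q >= 0, R > 0 self-adjoint weights:
%   \dot{x}(t) = A x(t) + B u(t), \qquad x(0) = x_0,
% with cost functional
%   J(u) = \int_0^\infty \bigl( \langle Q x(t), x(t) \rangle
%          + \langle R u(t), u(t) \rangle \bigr)\, dt.
% Under suitable stabilizability/detectability assumptions, the optimal
% cost is J^* = \langle P x_0, x_0 \rangle, where P is a self-adjoint
% solution of the (operator) algebraic Riccati equation
%   A^* P + P A - P B R^{-1} B^* P + Q = 0,
% and the optimal control is given in feedback form by
%   u^*(t) = -R^{-1} B^* P\, x(t).
```

In the finite-horizon case the algebraic equation is replaced by a Riccati differential equation with terminal condition, which is the setting studied in the book's final chapter.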


49-02 Research exposition (monographs, survey articles) pertaining to calculus of variations and optimal control
49J20 Existence theories for optimal control problems involving partial differential equations
49K20 Optimality conditions for problems involving partial differential equations
49L20 Dynamic programming in optimal control and differential games
93B52 Feedback control
49L25 Viscosity solutions to Hamilton-Jacobi equations in optimal control and differential games
93B05 Controllability