zbMATH — the first resource for mathematics

Nonlinear dynamical systems and control. A Lyapunov-based approach. (English) Zbl 1142.34001
Princeton, NJ: Princeton University Press (ISBN 978-0-691-13329-4/hbk). xxvi, 948 p. (2008).
The main objective of this monograph is to present necessary mathematical tools for stability analysis and control design of nonlinear ODE systems, with an emphasis on Lyapunov-based methods.
Chapter 2 gives a systematic development of nonlinear dynamical systems theory, covering qualitative properties of solutions: existence, uniqueness, continuity, and continuous dependence on initial conditions and system parameters.
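The existence-uniqueness theory treated in Chapter 2 classically rests on Picard's successive approximations. As a minimal numerical sketch (the initial value problem ẋ = x, x(0) = 1 and all identifiers below are illustrative choices, not taken from the book), the iterates φ_{k+1}(t) = x₀ + ∫₀ᵗ f(φ_k(s)) ds can be observed converging to the true solution eᵗ:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

def picard_iterates(f, x0, t, n_iter):
    """Successive approximations: phi_{k+1}(t) = x0 + integral_0^t f(phi_k(s)) ds."""
    phi = np.full_like(t, x0, dtype=float)
    for _ in range(n_iter):
        phi = x0 + cumulative_trapezoid(f(phi), t, initial=0.0)
    return phi

t = np.linspace(0.0, 1.0, 1001)
phi = picard_iterates(lambda x: x, 1.0, t, n_iter=20)  # IVP x' = x, x(0) = 1
print(abs(phi[-1] - np.e))  # small: iterates approach the true solution e^t
```

Since f is Lipschitz here, the contraction-mapping argument behind Picard iteration guarantees this convergence; the chapter's existence-uniqueness theorem formalizes exactly that mechanism.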
Ch. 3 presents stability theory for nonlinear dynamical systems, covering Lyapunov stability theorems for autonomous systems, converse Lyapunov theorems, Lyapunov instability theorems, several systematic approaches to the construction of Lyapunov functions, stability of linear systems, and Lyapunov's linearization method.
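As a toy illustration of the Lyapunov theorems in this chapter (the system and the quadratic candidate below are illustrative choices of this review, not examples from the book), the sign conditions can be spot-checked numerically for the damped oscillator ẋ₁ = x₂, ẋ₂ = −x₁ − x₂ with V(x) = 1.5x₁² + x₁x₂ + x₂²:

```python
import numpy as np

def f(x):
    """Damped oscillator: x1' = x2, x2' = -x1 - x2."""
    return np.array([x[1], -x[0] - x[1]])

def V(x):
    """Quadratic Lyapunov candidate V(x) = 1.5*x1^2 + x1*x2 + x2^2."""
    return 1.5 * x[0]**2 + x[0] * x[1] + x[1]**2

def Vdot(x):
    """Derivative along trajectories: grad V(x) . f(x)  (= -x1^2 - x2^2 here)."""
    grad = np.array([3.0 * x[0] + x[1], x[0] + 2.0 * x[1]])
    return grad @ f(x)

# Sample nonzero points; Lyapunov's theorem asks for V positive definite
# and Vdot negative definite, which holds at every sample.
rng = np.random.default_rng(0)
samples = rng.uniform(-5.0, 5.0, size=(1000, 2))
samples = samples[np.linalg.norm(samples, axis=1) > 1e-6]
print(all(V(x) > 0 and Vdot(x) < 0 for x in samples))
```

Such grid sampling is of course only heuristic evidence; the point of the chapter's theorems is that the algebraic identity V̇(x) = −x₁² − x₂² proves global asymptotic stability outright.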
Ch. 4 provides an advanced treatment of stability theory including partial stability, stability theory for non-autonomous systems, Lagrange stability, boundedness, ultimate boundedness, input-to-state stability, finite-time stability, semistability, and stability theorems via vector Lyapunov functions. Lyapunov and asymptotic stability of sets, as well as stability of periodic orbits, are studied in detail. Local and global stability theorems are proved using lower semicontinuous Lyapunov functions. Generalized invariant set theorems are derived wherein system trajectories converge to the union of the largest invariant sets contained in the boundary of the intersections over finite intervals of the closure of generalized Lyapunov level surfaces.
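The distinction between finite-time and merely asymptotic stability treated in this chapter can be seen in a short simulation. The scalar system ẋ = −sign(x)|x|^{1/3} is a classical finite-time-stable example (reaching the origin at t = 1.5|x₀|^{2/3}), while ẋ = −x only converges exponentially; the example choice and the forward-Euler scheme below are this review's own sketch, not the book's:

```python
import numpy as np

def simulate(rhs, x0, dt, T):
    """Forward-Euler integration of the scalar ODE x' = rhs(x)."""
    x = x0
    for _ in range(int(T / dt)):
        x += dt * rhs(x)
    return x

dt, T = 1e-3, 2.0
# Finite-time stable: reaches 0 at t = 1.5*|x0|^(2/3) = 1.5 for x0 = 1.
x_ft = simulate(lambda x: -np.sign(x) * abs(x)**(1.0 / 3.0), 1.0, dt, T)
# Exponentially stable: x(T) = e^{-T} > 0, never exactly 0.
x_exp = simulate(lambda x: -x, 1.0, dt, T)
print(abs(x_ft), x_exp)  # first is ~0 (up to step-size chatter), second ~ e^{-2}
```

The right-hand side of the finite-time example is continuous but not Lipschitz at the origin, which is exactly why the uniqueness-based arguments of Chapter 2 do not preclude trajectories reaching zero in finite time.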
Using the system's stored energy and the energy supplied from the environment, Ch. 5 gives a systematic treatment of dissipativity theory. Building on dissipativity concepts, Ch. 6 presents feedback interconnection stability results for nonlinear dynamical systems; general criteria are given for Lyapunov, asymptotic, and exponential stability of feedback systems. Using the notions of control Lyapunov functions, feedback linearization, zero dynamics, minimum-phase systems, and stability margins for nonlinear feedback systems, optimal control problems are considered in which a performance functional is minimized over all possible closed-loop trajectories. Ch. 7 provides a brief treatment of input-output stability and dissipativity theory.
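The dissipativity framework of Ch. 5 is organized around the dissipation inequality; in the notation standard in this literature (V_s a storage function, s a supply rate; the symbols are generic, not necessarily the book's), a system is dissipative with respect to s if along all trajectories

```latex
V_s(x(t)) - V_s(x(t_0)) \le \int_{t_0}^{t} s\bigl(u(\tau), y(\tau)\bigr)\, d\tau,
\qquad t \ge t_0,
% with two typical supply rates:
s(u,y) = u^{\mathsf T} y \quad \text{(passivity)}, \qquad
s(u,y) = \gamma^2 u^{\mathsf T} u - y^{\mathsf T} y \quad \text{(finite gain)}.
```

The feedback interconnection results of Ch. 6 then combine the storage functions of the subsystems into a Lyapunov function for the closed loop, which is the mechanism behind the stability criteria listed above.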
Ch. 8 provides a unified framework for optimal nonlinear analysis and feedback control. Asymptotic stability of the closed-loop nonlinear system is guaranteed by means of a Lyapunov function that is shown to be a solution of the steady-state form of the Hamilton-Jacobi-Bellman equation, and consequently guarantees both stability and optimality. Using the optimal control framework of Ch. 8, Ch. 9 presents a unification of nonlinear-nonquadratic optimal control and backstepping control. In Ch. 10, using dissipativity theory with appropriate storage functions and supply rates, the nonlinear disturbance rejection problem is transformed into an optimal control problem by modifying a nonlinear-nonquadratic cost functional to account for exogenous disturbances. It is shown that the Lyapunov function guaranteeing closed-loop stability is a solution of the steady-state Hamilton-Jacobi-Bellman equation for the controlled system.
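The steady-state Hamilton-Jacobi-Bellman equation central to these chapters takes, in the standard infinite-horizon form (with generic symbols L for the running cost, f for the dynamics, and V for the value function; not necessarily the book's exact notation),

```latex
0 = \min_{u}\, \bigl[ L(x,u) + V'(x)\, f(x,u) \bigr], \qquad V(0) = 0,
```

so that the minimizing feedback u = φ(x) yields V̇ = V'(x) f(x, φ(x)) = −L(x, φ(x)) ≤ 0 along closed-loop trajectories whenever L is nonnegative. This is precisely how a single function V certifies both closed-loop stability and optimality of the cost.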
Ch. 11 develops a unified framework for optimal nonlinear robust control. As in the disturbance rejection problem, the robust control problem is transformed into an optimal control problem by properly modifying the cost functional to account for the system uncertainty. The resulting solution of the modified optimal control problem guarantees robust stability and performance for a class of nonlinear uncertain systems. In Ch. 12, extending the results of Ch. 11, robust stability of the closed-loop nonlinear system is guaranteed by means of a parameter-dependent Lyapunov function composed of a fixed part and a variable part. In Chs. 13 and 14 the authors give a condensed presentation of the discrete-time counterparts of the continuous-time analysis and control synthesis results.
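In the discrete-time setting of Chs. 13 and 14, the Lyapunov machinery replaces derivatives with differences, ΔV(x) = V(f(x)) − V(x) < 0; for linear systems x_{k+1} = Ax_k this reduces to the discrete Lyapunov equation AᵀPA − P = −Q, which can be sketched numerically (the matrices below are illustrative choices, not from the book):

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Schur-stable example system x_{k+1} = A x_k (spectral radius 0.8 < 1).
A = np.array([[0.5, 0.2],
              [0.0, 0.8]])
Q = np.eye(2)

# scipy's solve_discrete_lyapunov(a, q) solves a @ X @ a.T - X + q = 0, so
# passing A.T yields P with A.T @ P @ A - P = -Q, the stability form above.
P = solve_discrete_lyapunov(A.T, Q)

# P > 0 certifies V(x) = x.T @ P @ x as a Lyapunov function,
# with difference Delta V(x) = -x.T @ Q @ x < 0 for x != 0.
print(np.linalg.eigvalsh(P))             # both eigenvalues positive
print(np.allclose(A.T @ P @ A - P, -Q))  # residual check of the equation
```

The existence of such a positive definite P for every positive definite Q is the discrete-time counterpart of the continuous-time Lyapunov equation AᵀP + PA = −Q, mirroring the continuous/discrete parallelism of these chapters.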

MSC:
34-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to ordinary differential equations
34D20 Stability of solutions to ordinary differential equations
93D09 Robust stability
93D20 Asymptotic stability in control theory
93-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to systems and control theory