This article centers on the relationship between stability and optimality. More precisely, the question considered is whether a robust control Lyapunov function for a control-affine nonlinear system with constrained controls and perturbations can be expressed as a solution of a Hamilton-Jacobi-Isaacs equation and hence enjoys certain optimality properties.
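A representative system class of this type (the exact formulation here is an illustrative assumption, not quoted from the article) is

```latex
\dot{x} = f(x, w) + g(x, w)\,u, \qquad u \in U, \quad w \in W,
```

where $x$ is the state, $u$ the control constrained to a set $U$, and $w$ the perturbation taking values in a set $W$.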
The first part of the paper establishes the relation between robust stability, robust asymptotic stability, and input-to-state stability on the one hand, and the existence of associated control Lyapunov functions on the other. The results also cover nonsmooth feedback laws and infinite-dimensional systems, and hence extend Artstein's theorem.
The second part is devoted to the question of how to synthesize a control law from a given control Lyapunov function. Here pointwise min-norm control laws are discussed and applied to several classes of control systems.
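The idea behind a pointwise min-norm control law is that, at each state, one picks the smallest control satisfying the Lyapunov decrease condition. The following sketch (a minimal illustration, not the authors' construction; the scalar example system and the margin function are assumptions) computes this in closed form for the unconstrained case:

```python
import numpy as np

def min_norm_control(LfV, LgV, sigma):
    """Smallest-norm u satisfying the CLF decrease condition
    LfV + LgV @ u <= -sigma at the current state (unconstrained u)."""
    b = np.atleast_1d(LgV).astype(float)
    excess = LfV + sigma          # violation of the decrease condition at u = 0
    bb = b @ b
    if excess <= 0 or bb == 0:
        return np.zeros_like(b)   # condition already holds, or no control authority
    return -(excess / bb) * b     # projection onto the constraint boundary

# hypothetical example: scalar system xdot = x + u with V(x) = x^2/2,
# so LfV = x^2, LgV = x; margin sigma(x) = x^2
x = 1.0
u = min_norm_control(LfV=x * x, LgV=x, sigma=x * x)
```

In this example the resulting feedback is $u = -2x$, i.e. the minimum effort that still renders $V$ a Lyapunov function with the prescribed decrease rate.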
Finally, the authors return to the original question and show that for any robust control Lyapunov function there exists a differential game such that the given function solves the corresponding Hamilton-Jacobi-Isaacs equation.
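In a representative formulation (the running cost $L$ and dynamics here are illustrative assumptions, not the authors' exact setup), such a Hamilton-Jacobi-Isaacs equation reads

```latex
\min_{u \in U} \, \max_{w \in W} \Big[ \nabla V(x) \cdot \big( f(x, w) + g(x, w)\,u \big) + L(x, u, w) \Big] = 0,
```

so a robust control Lyapunov function $V$ solving it acts as the value function of the associated differential game between the controller $u$ and the perturbation $w$.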