Discrete-time jump linear systems of the form x(k+1) = A(r(k))x(k) + B(r(k))u(k), with initial state x(0), are considered. It is assumed that the x-process and the control vector u are n- and m-dimensional, respectively, and that the form process r(k) is a finite-state Markov chain taking values in a finite set. Further, it is assumed that the cost criterion is quadratic.
First, the optimal control law is presented. This optimal control law is linear in the state x(k) at each time k, and it is different (in general) for each possible value of the form process. Further, necessary and sufficient conditions for the existence of a steady-state optimal controller are given. The results are illustrated by examples.
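The mode-dependent linear control law described above can be sketched with the standard backward coupled Riccati recursion for jump linear quadratic problems. This is a minimal illustrative sketch, not the paper's exact equations: the per-mode matrices A, B, Q, R, the transition matrix P, and the horizon N below are hypothetical example values.

```python
import numpy as np

def jlq_gains(A, B, Q, R, P, N):
    """Backward coupled Riccati recursion (textbook JLQ form, a sketch).
    A, B, Q, R: lists of per-mode matrices; P: Markov transition matrix
    of the form process; N: horizon.  Returns gains[k][i], the feedback
    gain at time k when the form process is in mode i, so that the
    optimal control is u(k) = -gains[k][i] @ x(k)."""
    M = len(A)                      # number of modes of the form process
    n = A[0].shape[0]
    K = [np.zeros((n, n)) for _ in range(M)]   # zero terminal cost-to-go
    gains = []
    for _ in range(N):
        # expected cost-to-go over the next mode, conditioned on mode i
        E = [sum(P[i, j] * K[j] for j in range(M)) for i in range(M)]
        L, Knew = [], []
        for i in range(M):
            S = R[i] + B[i].T @ E[i] @ B[i]
            Li = np.linalg.solve(S, B[i].T @ E[i] @ A[i])
            L.append(Li)
            Knew.append(Q[i] + A[i].T @ E[i] @ (A[i] - B[i] @ Li))
        K = Knew
        gains.append(L)
    gains.reverse()                 # gains[k][i]: time k, mode i
    return gains

# hypothetical two-mode scalar example: one stable, one unstable dynamic
A = [np.array([[0.5]]), np.array([[1.5]])]
B = [np.array([[1.0]]), np.array([[1.0]])]
Q = [np.eye(1), np.eye(1)]
R = [np.eye(1), np.eye(1)]
P = np.array([[0.9, 0.1], [0.2, 0.8]])
gains = jlq_gains(A, B, Q, R, P, N=20)
print(gains[0][0], gains[0][1])     # gains at k = 0 differ by mode
```

Note how the coupling enters only through the expectation E[i]: each mode's Riccati update otherwise looks like an ordinary discrete-time LQR step, which is what makes the resulting control law linear in x(k) with a separate gain per parameter set.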