Summary: This paper is concerned with the control of continuous-time linear systems whose parameters jump randomly according to a finite-state Markov process. It examines the relationship between appropriately defined controllability and stabilizability properties and the solution of the infinite-time jump linear quadratic (JLQ) optimal control problem. Although the solution of the continuous-time Markov JLQ problem with finite or infinite time horizons is known, only sufficient conditions for the existence of finite-cost, constant, stabilizing controls for the infinite-time problem appear in the literature.
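For concreteness, the system class and cost treated in such JLQ problems can be written in a standard form (the notation here is the conventional one and is assumed, not drawn from the paper itself):

```latex
\dot{x}(t) = A\bigl(r(t)\bigr)\,x(t) + B\bigl(r(t)\bigr)\,u(t),
\qquad
J = \mathbb{E}\!\left[\int_{0}^{\infty}\Bigl(x^{\mathsf{T}}Q\bigl(r(t)\bigr)x + u^{\mathsf{T}}R\bigl(r(t)\bigr)u\Bigr)\,dt\right],
```

where \(r(t)\) is a finite-state Markov process taking values in \(\{1,\dots,N\}\) with infinitesimal generator \(\Lambda=(\lambda_{ij})\), so the matrices \(A_i, B_i, Q_i, R_i\) switch with the current mode \(r(t)=i\).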
In this paper, necessary and sufficient conditions are established. These conditions rest on new definitions of controllability, observability, stabilizability, and detectability appropriate for continuous-time Markovian jump linear systems. These definitions play the same role for the JLQ problem that their deterministic counterparts play for the linear quadratic regulator (LQR) problem.
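To make the parallel with the LQR case concrete: in the standard formulation of the infinite-time JLQ problem, the optimal constant feedback gains are characterized by a set of coupled algebraic Riccati equations, one per Markov mode (this is the conventional form, with notation assumed rather than taken from the paper): for each mode \(i = 1,\dots,N\),

```latex
A_i^{\mathsf{T}}K_i + K_iA_i
- K_iB_iR_i^{-1}B_i^{\mathsf{T}}K_i
+ Q_i + \sum_{j=1}^{N}\lambda_{ij}K_j = 0,
```

with the optimal control \(u(t) = -R_i^{-1}B_i^{\mathsf{T}}K_i\,x(t)\) applied while \(r(t)=i\). The coupling term \(\sum_j \lambda_{ij}K_j\) is what distinguishes the jump problem from \(N\) independent LQR problems, and it is why deterministic controllability and detectability notions must be replaced by the mode-coupled definitions the paper introduces.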