The authors consider a complicated class of controlled stochastic systems evolving in continuous time. The state of the system is described by a continuous variable and a discrete variable
. Also, the control has two parts: a continuous-type control, which is a measurable stochastic process, and a discrete-type (or impulse) control, which is a sequence of random variables. The key ingredient is a set-interface, of which only the boundary actually matters; minimal and maximal set-interfaces are considered. When the state reaches the minimal set, a mandatory impulse (jump or switch) takes place, while if the state belongs to a maximal set, an optional impulse (jump or switch) may be applied, at the controller's discretion. Switchings and jumps may be autonomous or fully controlled. A discounted marginal cost is introduced, and the control problem consists in its minimization. The authors show that the dynamic programming approach leads to a rather involved quasi-variational inequality. If the system is non-degenerate, the classical treatment can be used to solve the control problem; otherwise, one can resort to the so-called viscosity solutions, which are described in the last part of the paper.
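For orientation, in hybrid systems combining continuous and impulse controls, the discounted cost and the associated quasi-variational inequality typically take forms like the following. This is only an illustrative sketch in generic notation ($f$ a running cost, $c$ an impulse cost, $\alpha > 0$ a discount rate, $\mathcal{L}^u$ the controlled generator, $\Gamma$ the jump map, $\theta_k$ and $\xi_k$ the impulse times and actions), not the authors' exact formulation:
\[
J = \mathbb{E}\Big[\int_0^{\infty} e^{-\alpha t}\, f\big(x(t),\nu(t),u(t)\big)\,dt
\;+\; \sum_{k\ge 1} e^{-\alpha\theta_k}\, c\big(x(\theta_k^-),\xi_k\big)\Big],
\]
and, formally, the value function $V$ satisfies
\[
\max\Big\{\alpha V - \inf_{u}\big[\mathcal{L}^{u}V + f(\cdot,\cdot,u)\big],\; V - \mathcal{M}V\Big\} = 0,
\qquad
\mathcal{M}V(x,\nu) = \inf_{\xi}\big[c(x,\xi) + V\big(\Gamma(x,\nu,\xi)\big)\big],
\]
with the obstacle constraint $V \le \mathcal{M}V$ holding in the optional (maximal) region and the equality $V = \mathcal{M}V$ enforced on the mandatory (minimal) interface, where intervention is compulsory.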