Fleming, Wendell H. Risk sensitive stochastic control and differential games. (English) Zbl 1132.93049
Commun. Inf. Syst. 6, No. 3, 161-178 (2006).

Summary: We give a concise introduction to risk sensitive control of Markov diffusion processes and related two-controller, zero-sum differential games. The method of dynamic programming for the risk sensitive control problem leads to a nonlinear partial differential equation of Hamilton-Jacobi-Bellman type. In the totally risk sensitive limit, this becomes the Isaacs equation for the differential game. There is another interpretation of the differential game using the Maslov idempotent probability calculus. We call this a max-plus stochastic control problem. These risk sensitive control/differential game methods are applied to problems of importance sampling for Markov diffusions.

MSC:
93E20 Optimal stochastic control
60J25 Continuous-time Markov processes on general state spaces
49L20 Dynamic programming in optimal control and differential games
49N70 Differential games and control

Keywords: Markov diffusion processes; zero-sum differential games; dynamic programming
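
For orientation, the following is a minimal sketch of the kind of equations the summary refers to, in generic notation; the dynamics $f,\sigma$, running cost $\ell$, terminal cost $g$, and small parameter $\varepsilon$ are illustrative placeholders and not taken from the paper. For a controlled diffusion $dX_s = f(X_s,u_s)\,ds + \sqrt{\varepsilon}\,\sigma(X_s)\,dW_s$ with risk sensitive value
\[
\phi^{\varepsilon}(t,x) \;=\; \inf_{u}\ \varepsilon \log E_{t,x}\!\left[\exp\!\Big(\varepsilon^{-1}\Big(\textstyle\int_t^T \ell(X_s,u_s)\,ds + g(X_T)\Big)\Big)\right],
\]
the logarithmic transform of dynamic programming gives a Hamilton-Jacobi-Bellman equation of the form
\[
-\,\partial_t \phi^{\varepsilon} \;=\; \frac{\varepsilon}{2}\,\mathrm{tr}\big(a(x)\,D^2\phi^{\varepsilon}\big)
 + \min_{u}\big[f(x,u)\cdot D\phi^{\varepsilon} + \ell(x,u)\big]
 + \tfrac{1}{2}\,D\phi^{\varepsilon}\cdot a(x)\,D\phi^{\varepsilon},
\qquad a = \sigma\sigma^{\top}.
\]
In the totally risk sensitive limit $\varepsilon \to 0$ the second order term drops out, and rewriting the quadratic term via $\tfrac12\,p\cdot a(x)\,p = \max_{v}\big[\sigma(x)v\cdot p - \tfrac12|v|^2\big]$ exhibits the limit value $W$ as that of a two-controller, zero-sum game with Isaacs equation
\[
-\,\partial_t W \;=\; \min_{u}\max_{v}\Big[\big(f(x,u)+\sigma(x)v\big)\cdot DW + \ell(x,u) - \tfrac12|v|^2\Big].
\]
Here the maximizing player $v$ is the quadratically penalized disturbance arising from the large deviation scaling; since $u$ and $v$ enter separately, the order of $\min$ and $\max$ is immaterial in this sketch.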