Pathwise stochastic control problems and stochastic HJB equations.

*(English)* Zbl 1140.60031

Authors' summary: We study a class of pathwise stochastic control problems in which the optimality is allowed to depend on the paths of an exogenous noise (or information) process. Such a phenomenon can be illustrated by an investor who wants to take advantage of certain extra information, but in a completely legal manner. We show that such a control problem may not even admit a "minimizing sequence," but that the (Bellman) dynamic programming principle nevertheless still holds. We then show that the corresponding Hamilton-Jacobi-Bellman equation is a stochastic partial differential equation, as was predicted by P.-L. Lions and P. E. Souganidis [C. R. Acad. Sci. Paris, Sér. I, Math. 327, 735–741 (1998; Zbl 0924.35203)]. Our main device is a Doss-Sussmann-type transformation introduced in our previous work [(1) Stochastic Processes Appl. 93, No. 2, 181–204 (2001; Zbl 1053.60065)] and [(2) Stochastic Processes Appl. 93, No. 2, 205–228 (2001; Zbl 1053.60066)]. With the help of this transformation we reduce the pathwise control problem to a more standard relaxed control problem, from which we are able to verify that the value function of the pathwise stochastic control problem is the unique stochastic viscosity solution of this stochastic partial differential equation, in the sense of [(1) and (2), loc. cit.].
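For orientation, the objects in the summary can be sketched schematically; the notation below is chosen purely for illustration and is not taken from the paper under review. The stochastic HJB equation is an SPDE driven by the exogenous noise, and the Doss-Sussmann-type transformation is designed to absorb the stochastic term:

```latex
% Illustrative sketch (notation assumed, not from the original paper).
% The value function u of the pathwise control problem is expected to
% solve a stochastic HJB equation of the schematic form
\[
  \mathrm{d}u(t,x)
  \;=\; \sup_{\alpha}\bigl\{\, \mathcal{L}^{\alpha} u(t,x) + f(x,\alpha) \,\bigr\}\,\mathrm{d}t
  \;+\; g\bigl(x, u(t,x)\bigr) \circ \mathrm{d}B_t ,
\]
% where B is the exogenous noise, \circ denotes Stratonovich
% integration, and \mathcal{L}^{\alpha} is the controlled second-order
% generator. A Doss-Sussmann-type transformation writes
\[
  u(t,x) \;=\; \eta\bigl(t, x, v(t,x)\bigr),
\]
% with \eta the stochastic flow generated by g; this removes the
% \mathrm{d}B_t term and leaves a PDE with random coefficients for v,
% which can then be attacked by standard (relaxed) control and
% viscosity-solution techniques.
```

In this reading, the "stochastic viscosity solution" of the summary is the notion defined via such a transformation in the authors' cited earlier work.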

Reviewer: A. V. Balakrishnan (Los Angeles)