This paper deals with optimal control problems for one-dimensional diffusion processes defined by the stochastic differential equation
$$dX_t = b(u_t)\,dt + \sigma(u_t)\,dW_t,$$
with initial condition $X_0 = x$, where $u = (u_t)$ is a control process and $W$ a standard Brownian motion. Thus $X$ has a control-dependent infinitesimal variance. The aim is to minimize the cost criterion
$$J(x;u) = E_x\Bigl[\int_0^\tau c(u_t)\,dt + \lambda\tau\Bigr],$$
where $\tau$ is the hitting time of $X$ at the boundary points $a$, $b$ ($a < x < b$), and $\lambda$ is a real constant.
Using the dynamic programming equation, the author investigates the value function and the optimal control. In particular, he obtains explicit expressions for the value function and the optimal control when the functions $b$, $\sigma$ and $c$ are proportional to a power of $u$.
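The dynamic programming equation referred to here can be sketched as follows, writing $V$ for the value function, $b$ and $\sigma$ for the drift and diffusion coefficients, and $c$ for the running cost (this notation is assumed for illustration and is not fixed by the review):
$$\min_{u}\Bigl\{ b(u)\,V'(x) + \tfrac{1}{2}\sigma^2(u)\,V''(x) + c(u) + \lambda \Bigr\} = 0, \quad a < x < b,$$
with boundary conditions $V(a) = V(b) = 0$, since the cost stops accruing once the process reaches a boundary point.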