Optimal control and replacement with state-dependent failure rate: Dynamic programming. (English) Zbl 0781.93097

Summary: A class of stochastic control problems in which the payoff depends on the running maximum of a diffusion process is described. The controller must make two kinds of decisions: first, choosing a work rate (which determines both the rate of profit and the proximity of failure), and second, deciding when to replace a deteriorated system with a new one. Preventive replacement is a realistic option when the cost of replacement after failure exceeds the cost of a preventive replacement.
We focus on the profit and replacement cost for a single work cycle and solve the problem in two stages. First, the optimal feedback control (work rate) is determined by maximizing the payoff during a single excursion of the controlled diffusion away from its running maximum; this step requires solving the Hamilton-Jacobi-Bellman (HJB) partial differential equation. The second step is to determine the optimal replacement set. Since failure is assumed to occur only on the set where the state is increasing, replacement is optimal only on this set, which leads to a simple formula for the optimal replacement level in terms of the value function.
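A minimal sketch of the first-stage HJB equation, under assumed notation (the paper's exact model is not reproduced in this abstract): for a controlled diffusion with drift $b(x,u)$ and diffusion coefficient $\sigma(x,u)$, profit rate $\pi(x,u)$, state-dependent failure rate $\lambda(x,u)$, and value function $V(x,m)$ during an excursion below the running maximum $m$, the dynamic programming equation would formally read

```latex
% Hypothetical notation (not from the paper): b, sigma = controlled drift and
% diffusion coefficient; pi = profit rate; lambda = state-dependent failure
% rate; V(x,m) = value function on the excursion region x < m.
\sup_{u}\Big\{\pi(x,u) + b(x,u)\,V_x(x,m)
  + \tfrac{1}{2}\sigma^2(x,u)\,V_{xx}(x,m)
  - \lambda(x,u)\,V(x,m)\Big\} = 0, \qquad x < m,
```

supplemented by a boundary condition at $x = m$ that couples $V$ to the dynamics of the running maximum. The maximizing $u$ yields the optimal feedback work rate; the replacement level is then read off from $V$ in the second stage.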


93E20 Optimal stochastic control
90B25 Reliability, availability, maintenance, inspection in operations research
49L20 Dynamic programming in optimal control and differential games
Full Text: DOI