Boundary crossing of Brownian motion. Its relation to the law of the iterated logarithm and to sequential analysis.

*(English)* Zbl 0604.62075
Lecture Notes in Statistics, 40. Berlin etc.: Springer-Verlag. V, 142 p. DM 28.50 (1986).

This monograph has two parts. The first chapter is purely probabilistic; the second treats two statistical problems in sequential analysis.

The first chapter studies the (stopping) time T at which a standard Brownian motion W(t) first crosses a boundary \(\psi(t)\), \(t>0\). Among the stopping bounds \(\psi(t)\) considered are the functions \(\psi_a(t)\), \(a>0\), where \(\psi_a(t)\) is the solution for x of \(h(x,t)=0\). Here \[ h(x,t)=t^{-1/2}\phi(xt^{-1/2})-a^{-1}\int^{\infty}_{0}t^{-1/2}\phi((x-\theta)t^{-1/2})\,F(d\theta), \] in which F is a sigma-finite measure on the positive half line and \(\phi\) is the standard normal density. (This is called the method of images. One can think of h(x,t) as the temperature at (x,t) after placing at \(t=0\) a unit amount of heat at \(x=0\) and a negative amount of heat with distribution \(a^{-1}F(d\theta)\) on the positive half line. Then \(\psi_a(t)\) is the value of x at time t where the temperature is 0.) Another important function is \[ f(y,s)=\int^{\infty}_{0}\exp(\theta y-2^{-1}\theta^2 s)\,F(d\theta), \] and \(\psi_a(t)\) is also the solution for x of \(f(xt^{-1},t^{-1})=a\). The stopping time is defined as \(T=\inf\{t>0: W(t)\geq\psi(t)\}\). It is shown that on the set \(x\leq\psi(t)\) one has \(P\{T>t,\ W(t)\in(x,x+dx)\}=h(x,t)\,dx\) and \(P\{T\leq t\mid W(t)=x\}=a^{-1}f(xt^{-1},t^{-1})\). It is also shown how this relates to a known result of H. Robbins and D. Siegmund [Ann. Math. Stat. 41, 1410-1429 (1970; Zbl 0255.60058)] on the first crossing by W(t) of a boundary \(\eta_a(t)\), defined as the solution for x of \(f(x,t)=a\).
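To make the method of images concrete, one can take the simplest illustrative choice of F (ours, not one singled out by the monograph): a one-point measure of mass c at \(\theta_0>0\). Then \(h(x,t)=0\) can be solved by taking logarithms, and the boundary \(\psi_a(t)\) comes out as a straight line in t. The following minimal Python sketch (all function names are ours) checks the closed form against a direct root search in x:

```python
import math

def phi(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def h(x, t, a, theta0, c):
    """Method-of-images temperature h(x,t) when F = c * delta_{theta0}:
    a unit heat source at x=0 minus an image of mass c/a at x=theta0."""
    s = math.sqrt(t)
    return (phi(x / s) - (c / a) * phi((x - theta0) / s)) / s

def psi_closed_form(t, a, theta0, c):
    """Taking logarithms in h(x,t)=0 gives the linear boundary
    psi_a(t) = theta0/2 + (t/theta0) * log(a/c)."""
    return 0.5 * theta0 + (t / theta0) * math.log(a / c)

def psi_bisect(t, a, theta0, c, lo=-10.0, hi=20.0, iters=100):
    """Solve h(x,t)=0 for x by bisection; the sign change is unique because
    phi(x/sqrt(t)) / phi((x-theta0)/sqrt(t)) is strictly decreasing in x."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if h(mid, t, a, theta0, c) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

a, theta0, c = 10.0, 1.0, 1.0
for t in (0.5, 1.0, 2.0):
    assert abs(psi_bisect(t, a, theta0, c) - psi_closed_form(t, a, theta0, c)) < 1e-8
```

For a one-point F the boundary is linear, which is exactly the situation in which the tangent approximation discussed below is exact.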

For \(\psi_a\) defined by the method of images an explicit formula for the density \(p_a(t)\) of T is derived. The remainder of the chapter is devoted to the tangent approximation to \(p_a(t)\), which results if the actual bound \(\psi_a(t)\) is replaced by the tangent to the graph of \(\psi_a(t)\) at \((\psi_a(t),t)\). It is shown that the ratio of the approximate to the exact expression converges to 1 as \(a\to\infty\), uniformly on any finite interval. The same problem is also treated for stopping bounds \(\psi_a(t)\) that do not necessarily arise from the method of images. The question of uniform convergence on the whole positive half line, or on finite intervals that grow with a, is also considered.
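The tangent approximation leans on the classical fact that first passage to a straight line is explicit. As a sanity check (standard Bachelier-Lévy formulas, not the monograph's notation; function names are ours), the sketch below verifies numerically that the first-passage density for a linear boundary \(b+ct\), \(b>0\), integrates to the closed-form crossing probability:

```python
import math

def phi(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def linear_fp_density(t, b, c):
    """Density of the first time W(t) crosses the line b + c*t (b > 0):
    p(t) = b * t^(-3/2) * phi((b + c*t)/sqrt(t))."""
    return b * t ** -1.5 * phi((b + c * t) / math.sqrt(t))

def linear_fp_cdf(t, b, c):
    """Closed-form P(T <= t) for the same linear boundary."""
    s = math.sqrt(t)
    return Phi((-b - c * t) / s) + math.exp(-2.0 * b * c) * Phi((c * t - b) / s)

# Midpoint-rule integration of the density recovers the closed-form cdf.
b, c, t_max, n = 1.0, 0.5, 5.0, 200000
dt = t_max / n
integral = sum(linear_fp_density((k + 0.5) * dt, b, c) * dt for k in range(n))
assert abs(integral - linear_fp_cdf(t_max, b, c)) < 1e-4
```

The tangent approximation replaces \(\psi_a\) near each t by such a line, so its accuracy is governed by how far \(\psi_a\) bends away from its tangent as \(a\to\infty\).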

The second chapter considers two statistical problems in which sequential sampling from a normal population with known variance but unknown mean is replaced by observing a Wiener process W(t) with unknown drift \(\theta\). In the first problem the hypothesis \(\theta=0\) is to be tested against \(\theta\neq 0\) with a test of power one, where the cost of observing the process up to time t is proportional to \(t\theta^2\) and the hypothesis is rejected at stopping (no loss is incurred if the process never stops). In the second problem one wants to test \(\theta\leq 0\) versus \(\theta>0\) under 0-1 loss and the same cost function. Bayes procedures are derived after putting a normal prior on \(\theta\), supplemented in the first problem by a positive prior probability on \(\theta=0\). In the second problem the solution can be given explicitly and is a repeated significance test. In the first problem such an explicit characterization is not possible. Instead, upper and lower bounds are provided by two members of the family of tests with stopping time \(T_{\lambda}\), \(\lambda>0\), defined as the first time that the posterior probability of \(\theta=0\) is \(\leq\lambda\).
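A rough numerical illustration of the stopping rule \(T_\lambda\) (our own discretized sketch; the Euler grid and all parameter names are assumptions, not the book's construction): with prior mass \(p_0\) on \(\theta=0\) and a N(0, \(\sigma^2\)) prior otherwise, the likelihood ratio given W(t)=x depends only on (x,t), the mixture over the normal prior has a closed form, and one stops the first time the posterior probability of \(\theta=0\) drops to \(\lambda\):

```python
import math, random

def mixture_lr(x, t, sigma2):
    """Integrated likelihood ratio of {theta ~ N(0, sigma2)} against theta = 0,
    given W(t) = x: the closed form of int exp(theta*x - theta^2*t/2) dN(0, sigma2),
    namely (1 + sigma2*t)^(-1/2) * exp(sigma2*x^2 / (2*(1 + sigma2*t)))."""
    d = 1.0 + sigma2 * t
    return math.exp(sigma2 * x * x / (2.0 * d)) / math.sqrt(d)

def posterior_null(x, t, p0, sigma2):
    """Posterior probability of theta = 0 after observing W(t) = x."""
    lr = mixture_lr(x, t, sigma2)
    return p0 / (p0 + (1.0 - p0) * lr)

def stop_time(drift, p0=0.5, sigma2=1.0, lam=0.01, dt=1e-3, t_max=50.0, seed=1):
    """Euler-discretized Wiener path with the given drift; return the first grid
    time at which the posterior of theta = 0 is <= lam (None if it never is)."""
    rng = random.Random(seed)
    w, t = 0.0, 0.0
    while t < t_max:
        t += dt
        w += drift * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if posterior_null(w, t, p0, sigma2) <= lam:
            return t
    return None
```

Under a strong drift, `stop_time(3.0)` returns quickly; under \(\theta=0\) the rule may never stop, which is exactly what makes a power-one test with no loss on the null path possible.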


Reviewer: R.A.Wijsman

##### MSC:

| Code | Description |
| --- | --- |
| 62L15 | Optimal stopping in statistics |
| 62-02 | Research exposition (monographs, survey articles) pertaining to statistics |
| 60-02 | Research exposition (monographs, survey articles) pertaining to probability theory |
| 60G40 | Stopping times; optimal stopping problems; gambling theory |
| 62L10 | Sequential statistical analysis |
| 60J65 | Brownian motion |