Linear, finite-dimensional, continuous-time, stationary stochastic control systems are considered. The notions of feedback stabilization, exact terminal controllability and exact observability are introduced. Using the theory of linear stochastic differential equations and well-known results on feedback stabilizability, controllability and observability, necessary and sufficient conditions for these properties to hold for stochastic control systems are formulated and proved. Moreover, some applications are pointed out and comparisons with results existing in the literature are given. The concept of the unremovable spectrum is also introduced and discussed in detail. The paper contains many remarks and comments concerning the stabilizability, observability and controllability of stochastic control systems. Finally, it should be pointed out that similar problems have been discussed in the papers [*A. E. Bashirov* and *K. R. Kerimov*, “On controllability conception for stochastic systems”, SIAM J. Control Optimization 35, 384–398 (1997; Zbl 0873.93076)] and [*N. I. Mahmudov*, “Controllability of linear stochastic systems”, IEEE Trans. Autom. Control 46, 724–731 (2001; Zbl 1031.93034)].
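
For orientation, systems of the class described above are commonly written in the following standard Itô form (a sketch assumed by this reviewer for illustration; the paper's precise model, coefficient matrices and noise structure may differ):

$$
dx(t) = \bigl(Ax(t) + Bu(t)\bigr)\,dt + \sum_{i=1}^{d} A_i\,x(t)\,dw_i(t), \qquad y(t) = Cx(t),
$$

where $x(t) \in \mathbb{R}^n$ is the state, $u(t) \in \mathbb{R}^m$ the control, $y(t) \in \mathbb{R}^p$ the observation, $w_1,\dots,w_d$ are independent scalar Wiener processes, and $A$, $B$, $C$, $A_i$ are constant matrices (stationarity). In this setting, exact terminal controllability at a time $T > 0$ typically asks that every square-integrable, $\mathcal{F}_T$-measurable random vector be reachable from a given initial state by an admissible control, while feedback stabilization asks for a matrix $K$ such that the closed-loop system with $u(t) = Kx(t)$ is (for example, mean-square) asymptotically stable.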