Hopfield neural networks (NNs) were very popular more than ten years ago, and since then there have been many attempts to establish stability conditions for their equilibria. Unfortunately, since Hopfield NNs are strongly nonlinear dynamical systems, such an analysis is in general intractable. In particular, the Lyapunov method cannot be applied in general: the divergence of the vector field of such a system can be positive, so the flow need not contract volumes. In the case of a non-symmetric weight matrix, for example, an equilibrium can be unstable, and the dynamics may instead exhibit limit cycles or other non-convergent behavior. The paper is poorly written. The given proof of sufficient conditions for global exponential stability of the equilibria contains many gaps and pitfalls. Moreover, the Lyapunov stability analysis cannot be carried out analytically at all. Consequently, the paper's conclusions concerning the convergence rate of learning and the reduction of neural computation time are unfounded as well.
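The point about non-symmetric weights can be checked numerically. The following is a minimal sketch (not taken from the paper under review): it integrates the standard continuous-time Hopfield-type dynamics dx/dt = -x + W tanh(x) by forward Euler, first with a symmetric weight matrix, where an energy (Lyapunov) function exists and the flow converges to an equilibrium, and then with a non-symmetric matrix whose Jacobian at the origin has positive trace (positive divergence), where the bounded flow settles onto a limit cycle instead of converging. The specific matrices and step sizes are illustrative choices, not values from the paper.

```python
import numpy as np

def simulate(W, x0, dt=0.01, steps=3000):
    """Forward-Euler integration of the Hopfield-type ODE dx/dt = -x + W @ tanh(x)."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (-x + W @ np.tanh(x))
        traj.append(x.copy())
    return np.array(traj)

# Symmetric weights: an energy function exists, so every trajectory converges.
W_sym = np.array([[0.0, 2.0],
                  [2.0, 0.0]])
traj_s = simulate(W_sym, [0.5, -0.3])
v_end = -traj_s[-1] + W_sym @ np.tanh(traj_s[-1])
print("symmetric: final |dx/dt| =", np.linalg.norm(v_end))  # ~0: converged

# Non-symmetric weights: the Jacobian at the origin is -I + W with trace +2
# (positive divergence), so the origin is an unstable spiral; since tanh keeps
# the flow bounded, the trajectory approaches a limit cycle rather than a point.
W_rot = np.array([[ 2.0, 2.0],
                  [-2.0, 2.0]])
traj_r = simulate(W_rot, [0.1, 0.0])
# Compare the state at t = 30 with the state one time unit earlier:
# a large difference means the system is still oscillating, not settling.
print("non-symmetric: displacement over last time unit =",
      np.linalg.norm(traj_r[-1] - traj_r[-100]))
```

This is exactly the obstruction noted above: a positive-divergence flow admits no globally decreasing energy function, so sufficient conditions for global exponential stability cannot hold without additional structural assumptions on the weight matrix.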