On the dissipation of energy upon classical measurement and on the relation between information and entropy.

*(English. Russian original)* Zbl 0916.94002
Probl. Inf. Transm. 34, No. 2, 147-149 (1998); translation from Probl. Peredachi Inf. 34, No. 2, 62-64 (1998).

From the text: The author considers the following model of classical measurement. The measurement device is a mechanical (or electromechanical) system, which transforms the measured value into a movement of the pointer. The pointer is moved by a force applied to the system\(\dots\).

The author establishes the following inequality, which relates the relative measurement precision to the energy dissipation: (1) \(Q>{1\over 2}kT\alpha^2\), \(\alpha=\Delta/\sigma\), where \(Q\) is the amount of dissipated energy, \(\alpha\) is the relative measurement precision, \(T\) is the temperature, \(k\) is Boltzmann's constant, \(\Delta\) is the pointer displacement, and \(\sigma\) is the root-mean-square fluctuation of the pointer. For the corresponding increase of the entropy \(\Delta S\), one has \((1')\) \(\Delta S>{1\over 2}k\alpha^2\).
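A quick numerical sketch of the bounds (1) and \((1')\). The temperature \(T = 300\) K and relative precision \(\alpha = 10\) (pointer displacement ten times its fluctuation) are illustrative assumptions, not values from the paper:

```python
# Evaluate the lower bounds Q > (1/2) k T alpha^2 and
# Delta S > (1/2) k alpha^2 from inequalities (1) and (1').
# T = 300 K and alpha = 10 are illustrative choices only.

k = 1.380649e-23  # Boltzmann's constant, J/K

def min_dissipated_energy(alpha, T):
    """Lower bound on the dissipated energy Q, inequality (1)."""
    return 0.5 * k * T * alpha**2

def min_entropy_increase(alpha):
    """Lower bound on the entropy increase Delta S, inequality (1')."""
    return 0.5 * k * alpha**2

T = 300.0     # temperature, K
alpha = 10.0  # relative precision Delta / sigma

Q = min_dissipated_energy(alpha, T)
dS = min_entropy_increase(alpha)
print(f"Q  > {Q:.3e} J")    # about 2.07e-19 J
print(f"dS > {dS:.3e} J/K")
```

Note that the two bounds are consistent with \(Q = T\,\Delta S\): even at a modest precision of ten, the minimal dissipation is several orders of magnitude above \(kT \approx 4\times 10^{-21}\) J.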

L. Brillouin [Science and Information Theory, Academic Press, New York (1956; Zbl 0071.13104)] attempted to formulate a new general principle establishing the increase of entropy in the process of obtaining information. His negentropy principle of information claims that the growth of entropy is not less than the amount of information obtained, where the amount of information is understood in Shannon's sense. The negentropy principle of information probably holds qualitatively. But it is doubtful that it holds quantitatively in all cases, since there is no reason to believe that only Shannon information matters in all information processes. For example, in the case considered, inequality \((1')\) suggests that one is dealing with Fisher information rather than Shannon information.

##### MSC:

| Code | Classification |
| --- | --- |
| 94A17 | Measures of information, entropy |
| 00A79 | Physics (Use more specific entries from Sections 70-XX through 86-XX when possible) |
| 81P15 | Quantum measurement theory, state operations, state preparations |