## **Quantum information theory and quantum statistics.**
*(English)*
Zbl 1145.81002

Theoretical and Mathematical Physics (Cham). Berlin: Springer (ISBN 978-3-540-74634-8/hbk). ix, 214 p. (2008).

This book covers many of the central topics and current problems of quantum information theory, with the exception of quantum computation and cryptography. It is an introduction that emphasizes the mathematically interesting and beautiful aspects of quantum theory and information theory, as well as their synthesis, at an ambitious level suitable for graduate students. It offers mathematically rigorous and elegant proofs of the central propositions, and in this respect it can also serve teachers and researchers in their work. Every chapter concludes with bibliographical notes and a collection of exercises.

Chapter two (pp. 3–24), entitled “Prerequisites of Quantum Mechanics”, contains concise explanations of just those subjects that are needed throughout: states, state decompositions, selective operations, partial traces, and general state transformations. In chapter three (pp. 25–51), entitled “Information and its Measures”, the Shannon entropy is introduced and its use is illustrated by the classical source coding and data compression theorems. After the von Neumann entropy is introduced, the relative entropy and its quantum analogue are considered. Its monotonicity, i.e. the fact that the quantum relative entropy does not increase when both arguments are subjected to the same state operation (here called a general state transformation), is proved by the relative modular operator method due to the author, which provides an alternative to the proof given by A. Uhlmann. Several important properties and propositions are derived, and further properties follow from the quantum analogue of the Rényi entropy. Klein’s inequality can be found in the appendix of the book.
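The monotonicity just mentioned can be checked numerically for the simplest state transformation, the partial trace. The sketch below is our own illustration (all function names are ours, not the book's); it evaluates \(S(\rho\|\sigma)=\mathrm{Tr}\,\rho(\log\rho-\log\sigma)\) via eigendecompositions:

```python
import numpy as np

def logm_psd(A):
    # matrix logarithm of a positive-definite Hermitian matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.conj().T

def rel_entropy(rho, sigma):
    # quantum relative entropy S(rho || sigma) = Tr rho (log rho - log sigma)
    return np.trace(rho @ (logm_psd(rho) - logm_psd(sigma))).real

def random_state(d, rng):
    # random full-rank density matrix on C^d
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def reduce_first(rho, dA, dB):
    # partial trace over the second tensor factor of a state on C^dA (x) C^dB
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

rng = np.random.default_rng(1)
rho, sigma = random_state(4, rng), random_state(4, rng)
full = rel_entropy(rho, sigma)
reduced = rel_entropy(reduce_first(rho, 2, 2), reduce_first(sigma, 2, 2))
# monotonicity: the relative entropy does not increase under the partial trace
print(full >= reduced - 1e-10)  # True
```

The partial trace is implemented by reshaping the \(4\times 4\) matrix into a four-index tensor and tracing out the second pair of indices.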

The Schmidt decomposition, entanglement, teleportation and dense coding are the subject of chapter four (pp. 53–72), entitled “Entanglement”. The characterization of maximal entanglement for bipartite pure states is followed by several criteria for entanglement of mixed states. Teleportation is explained for both the standard and the general case. Besides the entanglement entropy, the entanglement of formation and another measure, called squashed entanglement, are considered.
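Computationally, the Schmidt decomposition of a bipartite pure state is a singular value decomposition of the reshaped coefficient matrix, and the entanglement entropy is the Shannon entropy of the squared Schmidt coefficients. A minimal sketch (our own illustration, not the book's code):

```python
import numpy as np

def schmidt_coefficients(psi, dA, dB):
    # reshape the coefficient vector into a dA x dB matrix; its singular values
    # are the Schmidt coefficients of the bipartite pure state
    return np.linalg.svd(psi.reshape(dA, dB), compute_uv=False)

def entanglement_entropy(psi, dA, dB):
    # von Neumann entropy of either reduced state: -sum_i s_i^2 log s_i^2
    p = schmidt_coefficients(psi, dA, dB) ** 2
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# maximally entangled two-qubit state (|00> + |11>)/sqrt(2)
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(entanglement_entropy(bell, 2, 2))  # log 2 ~ 0.6931

# product state |0>|0> has zero entanglement entropy
prod = np.array([1.0, 0.0, 0.0, 0.0])
print(entanglement_entropy(prod, 2, 2))  # 0.0
```

A bipartite pure state is maximally entangled exactly when all Schmidt coefficients are equal, as for the Bell state above.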

The fifth chapter (pp. 73–82) introduces the mutual entropy and its quantum analogue. Markov chains are related to the entropy of joint distributions of random variables. The von Neumann entropy of particle systems, strong subadditivity, and the Holevo bound are explained. In the final section the entropy exchange, which concerns the interaction of a quantum system with its environment, is considered.
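Strong subadditivity, \(S(ABC)+S(B)\le S(AB)+S(BC)\), can be verified numerically for a random three-qubit state. The sketch below is our own illustration (the helper names are ours); it implements a general partial trace and checks the inequality:

```python
import numpy as np

def vn_entropy(rho):
    # von Neumann entropy -Tr rho log rho
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log(w))

def ptrace(rho, dims, keep):
    # reduced density matrix on the subsystems listed in `keep`
    n = len(dims)
    T = rho.reshape(dims * 2)
    # trace out the dropped subsystems, from the highest index down
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        T = np.trace(T, axis1=i, axis2=i + T.ndim // 2)
    d = int(np.prod([dims[i] for i in keep]))
    return T.reshape(d, d)

rng = np.random.default_rng(2)
G = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
rho = G @ G.conj().T
rho /= np.trace(rho).real          # random state on three qubits A, B, C
dims = [2, 2, 2]
ssa_gap = (vn_entropy(ptrace(rho, dims, [0, 1]))    # S(AB)
           + vn_entropy(ptrace(rho, dims, [1, 2]))  # S(BC)
           - vn_entropy(rho)                        # S(ABC)
           - vn_entropy(ptrace(rho, dims, [1])))    # S(B)
print(ssa_gap >= -1e-10)  # True: strong subadditivity holds
```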

Quantum data compression is the topic of the sixth chapter (pp. 83–90). Here, instead of the rate distortion, the quantity \(1 - F\), \(F\) being the fidelity, is used. Reliable quantum data compression is introduced and Schumacher’s source coding theorem is proved. A further theorem establishes the existence of a high probability subspace that works universally for all states with von Neumann entropy smaller than a given bound.
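The idea behind the high probability subspace can be illustrated with a toy calculation (our own illustration; the parameters are assumed, not from the book): for a qubit source \(\rho=\mathrm{diag}(p,1-p)\), a subspace of dimension roughly \(2^{n(S+\delta)}\) already carries almost all of the weight of \(\rho^{\otimes n}\), so the distortion \(1-F\) is small even though the subspace is exponentially smaller than \(2^n\):

```python
from math import comb, log2

p, n, delta = 0.9, 30, 0.15                   # illustrative parameters
S = -(p * log2(p) + (1 - p) * log2(1 - p))    # von Neumann entropy in bits
dim = int(2 ** (n * (S + delta)))             # dimension of the kept subspace
# eigenvalues of rho^{(x) n} are p^k (1-p)^(n-k) with multiplicity C(n, k);
# sort them in decreasing order and keep the `dim` largest
eigs = sorted(((p**k * (1 - p)**(n - k), comb(n, k)) for k in range(n + 1)),
              reverse=True)
kept, F = 0, 0.0
for val, mult in eigs:
    take = min(mult, dim - kept)
    F += take * val                            # trace weight of the kept subspace
    kept += take
    if kept >= dim:
        break
print(F > 0.9, dim < 2 ** n)  # True True
```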

Chapter seven (pp. 91–107), entitled “Channels and their capacity”, begins by comparing classical and quantum channels. The Shannon capacity and Shannon’s noisy channel coding theorem are described. For quantum channels the Holevo quantity, also called quantum mutual information, and the Holevo capacity are considered, and it is noted that the classical mutual information for the transmission of classical information through a quantum channel is bounded by the Holevo quantity. In order to estimate the Holevo quantity, the author introduces the relative entropy center and the exact relative entropy radius of a family of states. Examples are considered, further concepts are introduced, and several theorems are proved.
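The Holevo quantity \(\chi=S\bigl(\sum_i p_i\rho_i\bigr)-\sum_i p_i S(\rho_i)\) is easy to compute for a concrete ensemble. The sketch below is our own illustration (the two non-orthogonal pure qubit states are an assumed example); it shows that \(\chi\) stays strictly below one bit when the signal states overlap:

```python
import numpy as np

def vn_entropy(rho):
    # von Neumann entropy in bits
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log2(w))

def holevo(probs, states):
    # chi = S(sum_i p_i rho_i) - sum_i p_i S(rho_i)
    avg = sum(p * r for p, r in zip(probs, states))
    return vn_entropy(avg) - sum(p * vn_entropy(r) for p, r in zip(probs, states))

# two non-orthogonal pure qubit states sent with equal probability
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
states = [np.outer(k, k) for k in (ket0, ketp)]
chi = holevo([0.5, 0.5], states)
print(0.0 < chi < 1.0)  # True: strictly less than 1 bit, since the states overlap
```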

Hypothesis testing, already mentioned in chapter three, is the subject of chapter eight (pp. 109–120). Here the quantum Stein lemma, which links the hypothesis error probabilities asymptotically to the quantum relative entropy, is proved. In the second part of this chapter the classical Chernoff bound is generalized to the quantum case, and a quantum analogue of the classical Chernoff theorem is proved.
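In the Chernoff setting the optimal error exponent for symmetric hypothesis testing is \(-\log\min_{0\le s\le 1}\mathrm{Tr}\,\rho^s\sigma^{1-s}\). A minimal numerical sketch (our own illustration, using a simple grid search over \(s\)):

```python
import numpy as np

def mat_pow(A, s):
    # A^s for a positive semidefinite Hermitian matrix, via eigendecomposition
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 0.0, None)
    return (V * w**s) @ V.conj().T

def chernoff_exponent(rho, sigma, grid=1001):
    # quantum Chernoff bound: -log min_{0<=s<=1} Tr rho^s sigma^{1-s}
    vals = [np.trace(mat_pow(rho, s) @ mat_pow(sigma, 1 - s)).real
            for s in np.linspace(0.0, 1.0, grid)]
    return -np.log(min(vals))

rho = np.diag([0.8, 0.2])
sigma = np.diag([0.4, 0.6])
xi = chernoff_exponent(rho, sigma)
print(xi > 0.0)  # True: distinct states are distinguished at an exponential rate
print(abs(chernoff_exponent(rho, rho)) < 1e-9)  # True: identical states give exponent 0
```

For commuting states, as here, the quantity reduces to the classical Chernoff bound for the two eigenvalue distributions.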

Coarse graining, the topic of chapter nine (pp. 121–142), is introduced as a positive linear embedding of an algebra of observables into another one. After considering important examples and introducing the reduction of density operators w.r.t. a coarse graining, conditional expectations are defined, and their relation to state operations is explained. Takesaki’s theorem, which concerns the existence of a conditional expectation preserving a given density operator under reduction, is proved. Another result is the equation \(S(\sigma\|\rho)-S(\sigma_0\|\rho_0)=S(\sigma\|\rho\circ E)\), where \(\rho_0, \sigma_0\) are the reduced states of \(\rho,\sigma\) w.r.t. the conditional expectation \(E\), respectively. Commutative diagrams for compositions of conditional expectations and a superadditivity property of the relative entropy are proved. Sufficiency criteria for the description of given statistical experiments by coarse-grained observables are considered. The final section of the chapter concerns Accardi’s concept of Markov states.

Chapter ten (pp. 143–164) concerns state estimation. It begins by introducing estimation schemes. Such a scheme consists of a sequence of pairs \((F_n,\Phi_n)\), where \(F_n\) is a generalized observable on \({\mathcal H}^{\otimes n}\) and \(\Phi_n\) is a function which maps the outcome of a measurement of \(F_n\) onto a density operator on \(\mathcal H\). The scheme is called unbiased if the expectation value of \(\Phi_n\) for measurements on \(\rho^{\otimes n}\) equals \(\rho\) for an arbitrary density operator \(\rho\), and consistent if this holds at least asymptotically as \(n \rightarrow \infty\). Seven examples illustrate such schemes. For unbiased schemes, the Cramér-Rao inequality is generalized to the quantum case; the bound in this inequality is expressed through the quantum Fisher information. Using the quantum Fisher information and coarse graining, a Cramér-Rao inequality is also derived for biased estimation schemes. In a special case the quantum Fisher information is related to the Wigner-Yanase skew information.
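For a smooth family \(\rho_\theta\), the symmetric-logarithmic-derivative version of the quantum Fisher information can be evaluated from the eigendecomposition of \(\rho\) as \(F = 2\sum_{i,j:\lambda_i+\lambda_j>0} |\langle i|\partial_\theta\rho|j\rangle|^2/(\lambda_i+\lambda_j)\). The sketch below is our own illustration (the rotation family is an assumed example); for a pure state under a unitary family with generator \(G\) the value reduces to \(4\,\mathrm{Var}(G)\):

```python
import numpy as np

def quantum_fisher_info(rho, drho):
    # SLD quantum Fisher information from the eigendecomposition of rho
    lam, V = np.linalg.eigh(rho)
    M = V.conj().T @ drho @ V      # drho in the eigenbasis of rho
    F = 0.0
    for i in range(len(lam)):
        for j in range(len(lam)):
            if lam[i] + lam[j] > 1e-12:
                F += 2.0 * abs(M[i, j])**2 / (lam[i] + lam[j])
    return F

Z = np.diag([1.0, -1.0])                    # Pauli Z
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)                  # pure state |+><+|
# derivative of the family e^{-i t Z/2} rho e^{+i t Z/2} at t = 0
drho = -1j * (Z / 2 @ rho - rho @ Z / 2)
print(quantum_fisher_info(rho, drho))       # 4 Var(Z/2) = 1 for this pure state
```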

The final chapter eleven (pp. 165–203), entitled “Appendix: Auxiliary linear and convex analysis”, is a very useful collection of mathematical tools for quantum informational investigations.

Reviewer: K.-E. Hellwig (Berlin)

### MSC:

| Code | Description |
| --- | --- |
| 81-01 | Introductory exposition (textbooks, tutorial papers, etc.) pertaining to quantum theory |
| 81P68 | Quantum computation |
| 81P15 | Quantum measurement theory, state operations, state preparations |
| 81-02 | Research exposition (monographs, survey articles) pertaining to quantum theory |
| 94A05 | Communication theory |
| 94A17 | Measures of information, entropy |
| 94A24 | Coding theorems (Shannon theory) |
| 94A29 | Source coding |
| 94A34 | Rate-distortion theory in information and communication theory |
| 94A40 | Channel models (including quantum) in information and communication theory |
| 62F03 | Parametric hypothesis testing |
| 62F12 | Asymptotic properties of parametric estimators |
| 62F15 | Bayesian inference |