Non-commutative cryptography and complexity of group-theoretic problems. With an appendix by Natalia Mosina.

*(English)* Zbl 1248.94006
Mathematical Surveys and Monographs 177. Providence, RI: American Mathematical Society (AMS) (ISBN 978-0-8218-5360-3/hbk). xiv, 385 p. (2011).

This is an extensive exposition of cryptographic methods based on non-commutative groups. The two major themes in the book are cryptography and cryptanalysis based on non-commutative groups and generic complexity.

The first part is devoted to groups, complexity and cryptography. First, the basic public-key protocols for encryption and authentication are presented, as well as the notions of combinatorial group theory and its important decision problems: the word, conjugacy, decomposition, membership and isomorphism problems. For instance, Tietze's method to recognize isomorphisms, based on Nielsen's transformation rules, is discussed. The basics of complexity theory are also introduced here, namely deterministic, nondeterministic and probabilistic Turing machines, decision and search problems, reductions among problems, and the class NP.

The second part is devoted to non-commutative cryptography, which is based on one-way functions, realized in practice via hard problems in non-commutative groups. This part is an illustrative exposition of a variety of protocols. First, several key agreement protocols are presented. Then some platform groups are introduced, mainly braid groups, small cancellation groups, solvable groups, Thompson's group, Artin groups and the Grigorchuk group. These are representative groups with rapid growth whose word problems are efficiently, say in polynomial time, solvable. Within this context, reduction to normal forms is discussed. Authentication protocols are also reviewed: the authors show several zero-knowledge protocols and no-leaks protocols for authentication based on hard problems over non-commutative groups. No-leaks means in this context that, for any verifier, interacting with a prover and guessing without any interaction produce the same effect.
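To make the flavor of such conjugacy-based key agreement protocols concrete, here is a minimal sketch in the Ko-Lee style; as a stand-in platform it uses 4×4 matrices mod a prime with the two parties' secrets taken from commuting block subgroups. This is a hypothetical toy choice for illustration only, not one of the book's platform groups, where braid groups play the main role.

```python
# Toy Ko-Lee-style conjugacy key exchange over 4x4 matrices mod p.
# Alice's secrets live in the top-left 2x2 block, Bob's in the bottom-right,
# so the two secret subgroups commute elementwise.
p = 101

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) % p for j in range(n)]
            for i in range(n)]

def inv2(M):
    # inverse of a 2x2 matrix mod p via the adjugate formula
    a, b, c, d = M[0][0], M[0][1], M[1][0], M[1][1]
    det_inv = pow((a * d - b * c) % p, -1, p)
    return [[d * det_inv % p, -b * det_inv % p],
            [-c * det_inv % p, a * det_inv % p]]

def embed(M2, corner):
    # block-diagonal embedding: corner 0 -> top-left, 1 -> bottom-right
    E = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
    o = 2 * corner
    for i in range(2):
        for j in range(2):
            E[o + i][o + j] = M2[i][j] % p
    return E

# public word w (fixed here for reproducibility)
w = [[1, 2, 3, 4], [0, 1, 5, 6], [0, 0, 1, 7], [0, 0, 0, 1]]

a2 = [[2, 1], [1, 1]]                       # Alice's secret block
b2 = [[3, 2], [1, 1]]                       # Bob's secret block
a, a_inv = embed(a2, 0), embed(inv2(a2), 0)
b, b_inv = embed(b2, 1), embed(inv2(b2), 1)

pa = mat_mul(mat_mul(a, w), a_inv)          # Alice publishes a w a^-1
pb = mat_mul(mat_mul(b, w), b_inv)          # Bob publishes b w b^-1

key_alice = mat_mul(mat_mul(a, pb), a_inv)  # a b w b^-1 a^-1
key_bob = mat_mul(mat_mul(b, pa), b_inv)    # b a w a^-1 b^-1
assert key_alice == key_bob                 # equal because a and b commute
```

The shared key coincides precisely because the two secret subgroups commute; the security of the real protocols rests on the hardness of the underlying conjugacy search problem, which this toy platform does not provide.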

The third part deals with generic complexity and cryptanalysis. Average-case complexity deals mainly with the behavior of algorithms, i.e., the expected computation time of the solving algorithms. Generic complexity, in contrast, considers probability distributions on the space of inputs and the probability of finding “really hard” instances for the solving algorithms. Typically, a problem is posed on a domain \(I\) equipped with a size map \(s: I\rightarrow\mathbb{N}\), which partitions the domain into a collection of spheres \(\left(I_n\right)_{n\in\mathbb{N}}\), where \(I_n=s^{-1}(n)\). Any probability distribution on the domain \(I\) determines a probability distribution ensemble over the family \(\left(I_n\right)_{n\in\mathbb{N}}\), and conversely any probability distribution ensemble determines a probability distribution on the domain. A real-valued map \(f\) defined on \(I\) is polynomial in expectation if the map associating to each \(n\) the expected value of \(f\) over the \(n\)-th sphere, with respect to the local probability distribution, is polynomially bounded, and an algorithm is polynomial in expectation if its time function is. The authors show that some NP-complete problems, e.g., Hamiltonian Circuit, are polynomial in expectation, and give some criteria for deciding NP-completeness. A generic set is a set of probability 1 and a negligible set is the complement of a generic set. For an ensemble of distributions \(\left(\mu_n\right)_{n\in\mathbb{N}}\), with each \(\mu_n\) defined on the sphere \(I_n\), the asymptotic density is the map \(R\mapsto\lim_{n\to+\infty}\mu_n(R\cap I_n)\). A generic set is qualified according to the convergence rate of this limit; the strongly generic sets are those with superpolynomial convergence. A generic upper bound of an algorithm \(A\) is a map \(f\) such that the set \(H_{Af}\) of instances \(x\in I\) satisfying \(T_A(x)\leq f(s(x))\) is generic, where \(s\) is the size map and \(T_A\) is the time map of \(A\).
A generic bound receives the same qualification as the generic set \(H_{Af}\). Naturally, a problem is generically decidable in polynomial time if it has a solving algorithm with a polynomial generic upper bound. Generic complexity thus deals with the distribution of the inputs. It is interesting to see that there are not just intractable, but even undecidable problems, for instance the halting problem and Post's correspondence problem, with polynomial generic complexity. On the other hand, a problem \(D\) is strongly undecidable if it cannot be decided generically within any strongly generic set \(J\) of instances. The halting problem and Rice's theorem provide examples of strongly undecidable problems. The halting problem is thus both generically decidable and strongly undecidable; these assertions, however, depend on the chosen computational model. The authors describe an amplification procedure to build strongly undecidable versions of undecidable problems.
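As a toy illustration of these definitions (a hypothetical instance, not an example from the book), take the domain of binary strings with \(s(x)=|x|\), the uniform distribution \(\mu_n\) on each sphere \(I_n=\{0,1\}^n\), and \(R\) the set of strings containing at least one zero:

```python
# Asymptotic density on spheres of binary strings (toy instance):
# I_n = {0,1}^n with uniform mu_n; R = strings containing at least one '0'.
from itertools import product

def sphere(n):
    return [''.join(bits) for bits in product('01', repeat=n)]

def density(n):
    # mu_n(R intersect I_n) under the uniform distribution on the sphere I_n
    I_n = sphere(n)
    return sum(1 for x in I_n if '0' in x) / len(I_n)

# mu_n(R ∩ I_n) = 1 - 2^(-n) converges to 1 exponentially fast, so R is
# (strongly, indeed exponentially) generic; its complement, the singletons
# {'1'*n}, forms a negligible set.
for n in range(1, 8):
    assert density(n) == 1 - 2 ** -n
```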

The fourth part deals with asymptotically dominant properties and cryptanalysis. Modulo a coding of groups, the collection of groups can be realized as a language over the semigroup of strings on a fixed alphabet. The authors consider an ensemble of distributions \(\left(\mu_n\right)_{n\in\mathbb{N}}\) defined on a stratification \(\left(I_n\right)_{n\in\mathbb{N}}\) consisting of spheres with respect to a size function. A group property can then be generic, strongly generic or exponentially generic according to its image under the given group coding. For instance, the property of being a generating set of a free subgroup of fixed rank within a finitely generated group is exponentially generic, and the same holds in non-abelian partially commutative groups. The authors show how several key agreement protocols can be reduced to simultaneous conjugacy problems and, based on these results, that there may be exponentially generic sets of instances on which the protocols can be broken. The outlined attacks are based on word lengths and on geodesics (shortest paths in Cayley graphs).
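A minimal sketch of such a length-based attack can be given in the hypothetical toy setting of a free group, where the reduced word length coincides with the geodesic length in the Cayley graph: to recover a secret conjugator \(x\) from \(w\) and \(xwx^{-1}\), one greedily peels off the generator whose removal shortens the published word. (The attacks outlined in the book target subtler platforms such as braid groups, where geodesic lengths must be approximated.)

```python
# Length-based attack on the conjugacy search problem in the free group
# F(a, b); uppercase letters denote inverses (A = a^-1, B = b^-1).
def reduce_word(w):
    # free reduction: cancel adjacent inverse pairs with a stack
    out = []
    for c in w:
        if out and out[-1] != c and out[-1].lower() == c.lower():
            out.pop()
        else:
            out.append(c)
    return ''.join(out)

def inv(w):
    return w[::-1].swapcase()

def conjugate(w, g):
    # the reduced form of g w g^-1
    return reduce_word(g + w + inv(g))

GENS = ['a', 'A', 'b', 'B']

def length_attack(w, target):
    # Recover x with target = x w x^-1 one letter at a time: the correct
    # first letter g of x makes g^-1 (target) g strictly shorter.
    conjugator = ''
    while target != w:
        best = min(GENS, key=lambda g: len(conjugate(target, inv(g))))
        if len(conjugate(target, inv(best))) >= len(target):
            return None  # greedy descent is stuck
        target = conjugate(target, inv(best))
        conjugator += best
    return conjugator

w = 'abAB'                    # public, cyclically reduced word
target = conjugate(w, 'baa')  # published x w x^-1 with secret x = 'baa'
assert length_attack(w, target) == 'baa'
```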

The fifth part is devoted to the word and conjugacy search problems in groups. Techniques based on Stallings foldings and van Kampen diagrams are presented to solve the word search problem. A complete analysis of the Todd-Coxeter algorithm from the point of view of generic complexity is carried out. The conjugacy problem is treated through Schupp annular diagrams and pseudo-conjugacy graphs, and asymptotic generic properties of random conjugate words are presented.

The sixth part deals with the word problem in special classes of groups: in free solvable groups, introducing notions such as Magnus embeddings and Fox derivatives, and in automorphism groups of free groups, through a “programming approach” in which programs are formal rewriting systems. Here Plandowski's algorithm for deciding whether two straight-line programs produce the same word is developed in full detail.
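To see why such an algorithm is non-trivial, recall that a straight-line program is a context-free grammar producing a single word, and the produced word can be exponentially longer than the program. The following sketch (a toy illustration of this compactness, not Plandowski's algorithm itself) computes the length of the produced word by dynamic programming, without the exponential expansion that a naive equality check would require:

```python
# Straight-line programs as dicts: a rule maps a nonterminal either to a
# terminal letter or to a pair of earlier nonterminals (concatenation).
def slp_length(rules, start):
    # length of the produced word, computed without expanding the SLP
    memo = {}
    def ln(v):
        if v not in memo:
            rhs = rules[v]
            memo[v] = 1 if isinstance(rhs, str) else ln(rhs[0]) + ln(rhs[1])
        return memo[v]
    return ln(start)

def slp_expand(rules, start):
    # naive expansion: exponential output size in the worst case, which is
    # exactly what Plandowski's polynomial-time equality test avoids
    rhs = rules[start]
    if isinstance(rhs, str):
        return rhs
    return slp_expand(rules, rhs[0]) + slp_expand(rules, rhs[1])

# The doubling SLP: n + 1 rules produce a word of length 2^n.
n = 20
rules = {0: 'a'}
for i in range(1, n + 1):
    rules[i] = (i - 1, i - 1)

assert slp_length(rules, n) == 2 ** n   # a word of 1048576 letters, 21 rules
```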

The book closes with an appendix by Natalia Mosina presenting a probabilistic approach to the cryptanalysis of group-based cryptography. It is a self-contained exposition aimed at an extensive analysis of Sibert-type authentication protocols.

The book is a very interesting exposition of cryptography based on non-commutative groups. Nevertheless, a more practical approach would have been desirable: there is a lack of exercises, and some implementation hints for the great variety of exposed protocols would have been helpful. The text is assembled from several research papers published by the authors, and the reader can detect some changes of style during the exposition. This has also led to some repetitions (e.g., Sections 12.1 and 13.1 coincide). But altogether, the book is certainly a highly recommendable text for an advanced graduate course on cryptography.

Reviewer: Guillermo Morales-Luna (Mexico)

##### MSC:

| Code | Classification |
| --- | --- |
| 94-02 | Research exposition (monographs, survey articles) pertaining to information and communication theory |
| 94A60 | Cryptography |
| 94A62 | Authentication, digital signatures and secret sharing |
| 03D35 | Undecidability and degrees of sets of sentences |
| 11T71 | Algebraic coding theory; cryptography (number-theoretic aspects) |
| 20F10 | Word problems, other decision problems, connections with logic and automata (group-theoretic aspects) |
| 68Q25 | Analysis of algorithms and problem complexity |