*(English)* Zbl 0972.03002

Uncertainty has been traditionally described by probability theory, in which our uncertainty in a statement is described by a probability – a number from the interval $[0,1]$. After probability theory, several other formalisms have appeared that describe degrees of uncertainty, vagueness, etc. Most of these formalisms also use values from the interval $[0,1]$ to describe the corresponding degrees. In each of these formalisms, we encounter a similar problem: we know the degree of uncertainty (belief, etc.) $a=d(A)$ of a statement $A$, we know the degree of uncertainty $b=d(B)$ of a statement $B$, we do not have any additional information about $A$ and $B$, and we want to estimate the degree $d(A\&B)$. The only information that we can use in estimating this degree is the pair of numbers $a$ and $b$; hence this estimate is a function $T(a,b)$ of $a$ and $b$.

What are the natural properties of this function $T:[0,1]\times [0,1]\to [0,1]$? Since the statements $A\&B$ and $B\&A$ are equivalent, it is natural to require that the corresponding estimates $T(a,b)$ and $T(b,a)$ coincide, i.e., that the operation $T$ is commutative. Similarly, from the fact that $A\&(B\&C)$ and $(A\&B)\&C$ mean the same thing, we conclude that the operation $T$ must be associative. If our degree of belief in $A$ increases, then the degree of belief in $A\&B$ should either increase or stay the same; thus, $T$ must be monotonic. Finally, if we are absolutely sure about $A$, i.e., if $a=1$, then our degree of belief in $A\&B$ should be equal to the degree of belief in $B$, i.e., $T(1,b)=b$. A function $T$ satisfying these four properties is called a triangular norm (or a t-norm, for short).
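The four axioms above are easy to check numerically. Here is a minimal Python sketch (not from the book under review; all function names are ours) that spot-checks commutativity, associativity, monotonicity, and the neutral element $1$ on a finite grid for three standard t-norms:

```python
# Spot-check the four t-norm axioms on a grid of sample points.
# This is only a finite sanity check, not a proof.
import itertools

def t_min(a, b):    # minimum t-norm
    return min(a, b)

def t_prod(a, b):   # product t-norm
    return a * b

def t_luk(a, b):    # Lukasiewicz t-norm
    return max(a + b - 1.0, 0.0)

def is_t_norm(T, grid, eps=1e-9):
    """Check the four t-norm axioms for T at all grid points."""
    for a, b, c in itertools.product(grid, repeat=3):
        if abs(T(a, b) - T(b, a)) > eps:              # commutativity
            return False
        if abs(T(a, T(b, c)) - T(T(a, b), c)) > eps:  # associativity
            return False
        if a <= b and T(a, c) > T(b, c) + eps:        # monotonicity
            return False
    for b in grid:
        if abs(T(1.0, b) - b) > eps:                  # T(1, b) = b
            return False
    return True

grid = [i / 10 for i in range(11)]
print([is_t_norm(T, grid) for T in (t_min, t_prod, t_luk)])
# [True, True, True]
```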

This name comes from K. Menger who, in 1942, introduced such functions in the context of probabilistic metric spaces, in which the distance between two points is a random variable. To formulate the probabilistic version of the triangle inequality $\rho(x,z)\le \rho(x,y)+\rho(y,z)$, he needed to estimate the probability $P(\rho(x,y)\le a \ \text{and}\ \rho(y,z)\le b)$ based on the probabilities $P(\rho(x,y)\le a)$ and $P(\rho(y,z)\le b)$ of these two inequalities. Since then, t-norms have been successfully used in probability theory – to estimate, for random variables $X$ and $Y$, the probability $P(X\le x\ \&\ Y\le y)$ based on the probabilities $P(X\le x)$ and $P(Y\le y)$ – in fuzzy logic, etc.

In addition to the necessity of estimating the degree of belief in $A\&B$, we have a similar problem of estimating the degree of belief in $A\vee B$. This problem leads to the similarly general notion of t-conorm $S(a,b)$. From the mathematical viewpoint, there is a 1-1 correspondence between t-norms and t-conorms: if $T$ is a t-norm, then $1-T(1-a,1-b)$ is a t-conorm, and vice versa. Thus, the mathematical description of t-norms (“and”-operations) helps us to understand “or”-operations as well.
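The 1-1 correspondence between t-norms and t-conorms can be illustrated in a few lines of Python (a sketch of our own; the names `dual_conorm` and `s_prob` are not from the book). Dualizing the product t-norm yields the familiar probabilistic sum $a+b-ab$:

```python
# De Morgan duality: turn a t-norm T into the corresponding t-conorm S,
# via S(a, b) = 1 - T(1 - a, 1 - b).
def dual_conorm(T):
    return lambda a, b: 1.0 - T(1.0 - a, 1.0 - b)

t_prod = lambda a, b: a * b       # product t-norm
s_prob = dual_conorm(t_prod)      # probabilistic sum: a + b - a*b

print(s_prob(0.3, 0.4))           # ≈ 0.58  (= 0.3 + 0.4 - 0.3*0.4)
```

Applying `dual_conorm` twice recovers the original t-norm, which is exactly the "vice versa" direction of the correspondence.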

The practical usefulness of t-norms encouraged their theoretical analysis. Two areas of mathematics contributed the most to this analysis: first, since all requirements on the t-norms are functional equations, general functional equation theory has been used; second, since one of these conditions is associativity, a t-norm is a semigroup operation, so semigroup theory was used as well.

The book under review provides an encyclopedic overview of practically all known theoretical results about t-norms, with proofs of almost all these results (the only exception is a known result about Frank's t-norms, which is presented without proof). The authors clearly spent quite some time making many of these proofs much clearer and pedagogically better presented than in the original papers. The book also provides an overview of various applications of t-norms. Many of these applications lead to other interesting theorems; the scope of these applications is so wide and the methods so different that the authors wisely decided to present most of these results without reproducing the proofs.

One of the main results in t-norm theory is the 1960s classification theorem for continuous t-norms. This result represents every continuous t-norm as an “ordinal sum” of operations isomorphic to three standard ones: the product $a\cdot b$, $\max(a+b-1,0)$, and $\min(a,b)$. At first glance, it may seem that this result largely closed this field of study. Interestingly, the authors describe many new results (many of them proven by the authors themselves) that really increase our knowledge about t-norms. These results include: axiomatic characterizations of different known classes of t-norms; an approximation result according to which every continuous t-norm can be approximated by strict Archimedean ones (those isomorphic to $a\cdot b$); results on comparisons of t-norms; and results on the possibility to uniquely reconstruct a t-norm $T(a,b)$ from its values on some pairs $(a,b)$.

An interesting part of the book is related to the fact that t-norms come from “and”, but logic has connectives besides “and”. The relation between t-norms and “or” is straightforward. The relation to operations corresponding to “not” and “implies” is less direct – and thus mathematically more interesting. The authors overview the corresponding results.

The authors also overview different useful generalizations of t-norms: generalizations to domains other than $[0,1]$, from discrete chains to general lattices; more general associative operations on $[0,1]$; and, finally, non-associative operations on $[0,1]$. When an operation is non-associative, the transition from a binary to an $n$-ary operation becomes non-trivial; thus, the authors overview several $n$-ary operations similar to t-norms.

This book is perfectly written and includes almost all known results. It is a must for all researchers who use t-norms and want to know more about them, be it in statistics, in fuzzy studies, or elsewhere; even those who are already specialists in t-norms will find a lot of new results and – undoubtedly – a lot of new applications.