Necessary definitions and notations are introduced in Section 2. Probability functions are represented in two alternative ways: in exponential form and by summation. Side conditions make it possible to identify the interaction function of a set A of subindices i within the set of m discrete random variables. The dependence graph D on a vertex set M is then introduced.
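As a sketch of the exponential representation (the symbols below are illustrative; the paper's own notation may differ), for discrete random variables $X_1,\dots,X_m$ with outcome $x=(x_1,\dots,x_m)$ and index set $M=\{1,\dots,m\}$, a strictly positive probability function can be written as

\[
P(X=x)=\exp\Big(\sum_{A\subseteq M}\alpha_A(x_A)\Big),
\]

where $\alpha_A$ is the interaction function of the subset $A$ and $x_A$ denotes the coordinates of $x$ indexed by $A$. The side conditions, e.g. requiring $\alpha_A(x_A)=0$ whenever some coordinate of $x_A$ equals a fixed reference value, make the functions $\alpha_A$ uniquely determined.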
Using these results, the probability function of a random graph G is characterized in Theorem 1; it is a log-linear model. The sufficient statistics are fixed, and they indicate the "sufficient subgraphs" of G. Two examples illustrate the concepts and results. Theorem 2 gives the log-linear model and the sufficient statistics for a Markov graph, and Theorem 3 does the same for an undirected, homogeneous Markov graph. Inference for Markov graphs is studied, and the difficulties in obtaining maximum likelihood estimators are discussed. A Metropolis procedure, that is, a thermodynamic simulation method, makes it possible to obtain the maximum.
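The Metropolis procedure mentioned above can be sketched as follows; this is a minimal illustration under stated assumptions, not the paper's implementation. Graphs are sampled from an exponential (log-linear) model by repeatedly toggling a randomly chosen dyad and accepting the change with a probability that depends only on the change in the sufficient statistics, so the intractable normalizing constant never needs to be computed. The function names and the choice of statistics (edge and triangle counts) are assumptions for the example.

```python
import math
import random

def edge_triangle_stats(edges):
    """Example sufficient statistics: (number of edges, number of triangles)."""
    adj = {}
    for i, j in edges:
        adj.setdefault(i, set()).add(j)
        adj.setdefault(j, set()).add(i)
    # each triangle is counted once per incident edge, i.e. three times
    tri = sum(len(adj[i] & adj[j]) for i, j in edges) // 3
    return (len(edges), tri)

def metropolis_sample(n, theta, stats, steps=10000, seed=0):
    """Metropolis sampler for an exponential random graph model on n nodes.

    theta : parameter vector of the log-linear model
    stats : maps an edge set to its vector of sufficient statistics
    (names and signature are illustrative, not the paper's).
    """
    rng = random.Random(seed)
    dyads = [(i, j) for i in range(n) for j in range(i + 1, n)]
    g = set()          # start from the empty graph
    s = stats(g)
    for _ in range(steps):
        e = rng.choice(dyads)              # propose toggling one dyad
        h = set(g)
        h.symmetric_difference_update({e})
        t = stats(h)
        # log acceptance ratio: theta . (t - s); the normalizing
        # constant of the model cancels, which is the key point
        delta = sum(th * (b - a) for th, a, b in zip(theta, s, t))
        if delta >= 0 or rng.random() < math.exp(delta):
            g, s = h, t
    return g
```

In such a scheme, the maximum of the likelihood is approached by tuning `theta` until the simulated sufficient statistics match the observed ones; with a strongly negative edge parameter, for instance, the chain concentrates on sparse graphs.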
Logit regression is also proposed, and maximum likelihood estimators are derived for generated random graphs. Random directed graphs with a certain structure are characterized by fixing their probabilistic structure; the corresponding model is again log-linear. Markov directed graphs and homogeneous Markov directed graphs are also studied. The characterizations are given in Theorems 4, 5 and 6.
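The logit regression rests on the fact that, in a log-linear model, the conditional distribution of a single dyad given the rest of the graph is logistic in the change of the sufficient statistics. A minimal sketch, assuming a generic `stats` function and parameter vector `theta` (both illustrative names, not the paper's):

```python
import math

def edge_conditional_prob(edges, e, theta, stats):
    """P(dyad e present | rest of the graph) under a log-linear model.

    The log-odds equal theta . (stats(g with e) - stats(g without e)),
    so the model can be fitted dyad by dyad as a logistic regression
    on these change statistics (maximum pseudo-likelihood).
    """
    with_e = set(edges) | {e}
    without_e = set(edges) - {e}
    change = [b - a for a, b in zip(stats(without_e), stats(with_e))]
    logit = sum(th * c for th, c in zip(theta, change))
    return 1.0 / (1.0 + math.exp(-logit))
```

For example, under a pure edge-count model with `stats = lambda g: (len(g),)` and `theta = (0.0,)`, every dyad is conditionally present with probability 1/2.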
Various figures and tables illustrate the results. The Hammersley-Clifford theorem is used to describe the dependence structures associated with the different cases.
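For reference, the Hammersley-Clifford theorem (stated here in a standard form; the paper's notation may differ) says that a strictly positive probability function with dependence graph $D$ satisfies

\[
P(X=x)=\exp\Big(\sum_{A\in\mathcal{C}(D)}\alpha_A(x_A)\Big),
\]

where $\mathcal{C}(D)$ is the set of cliques of $D$; that is, the interaction function $\alpha_A$ can be nonzero only when $A$ is a clique of the dependence graph. This is what ties the dependence structure of each model to its log-linear representation.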