## Bayesian and non-Bayesian evidential updating. (English) Zbl 0622.68069

Orthodox probability theory supposes that (1) we commence with known statistical distributions, (2) these distributions give rise to real-valued probabilities, and (3) these probabilities can be updated by the use of Bayes’ theorem. Each of these suppositions has been challenged by recent work in computer science concerning the representation and updating of partial belief. One approach that has found acceptance is Shafer’s theory of belief functions, together with the corresponding updating procedure. This article addresses the relation between the Shafer/Dempster theory and a very slight extension of classical Bayesian theory.
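Supposition (3), point-valued updating by Bayes’ theorem, amounts to the following computation (a minimal sketch in Python; the two hypotheses and the prior and likelihood numbers are purely illustrative, and exact rationals are used to keep the arithmetic transparent):

```python
from fractions import Fraction as F

def bayes_update(prior, likelihood):
    """Bayes' theorem: posterior(h) is proportional to prior(h) * P(e | h)."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: w / total for h, w in unnorm.items()}

# Illustrative two-hypothesis example.
prior = {"h1": F(1, 2), "h2": F(1, 2)}
likelihood = {"h1": F(3, 4), "h2": F(1, 4)}  # P(evidence | h)
posterior = bayes_update(prior, likelihood)
print(posterior)  # {'h1': Fraction(3, 4), 'h2': Fraction(1, 4)}
```

Note that this orthodox update always yields a single real-valued posterior for each hypothesis; the approaches discussed in the article relax exactly this point-valuedness.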
There are four main results (not all of which are new): (1) Closed convex sets of classical probability functions provide a representation of belief that includes the theory of belief functions as a proper special case. (2) The impact of “uncertain evidence” can be formally represented by Dempster conditioning in Shafer’s framework. (3) The impact of “uncertain evidence” (Jeffrey conditioning) can be formally represented in the convex set framework by classical conditioning. (4) The main theorem: Suppose a distribution of beliefs is given both by $Bel_1$ and by the prior set of probability distributions $S_P$. Suppose that new evidence is obtained whose impact is given by the simple support function $Bel_A$, or by a corresponding shift in the probability of $A$ on each of the distributions in $S_P$ (it will not be the same shift for each distribution). Let $S_{PA}$ be the set of probability functions obtained by updating each member of $S_P$ according to Jeffrey’s rule. Then $$\min_{P\in S_{PA}}P(X)\leq [Bel_1\oplus Bel_A](X)\leq 1-[Bel_1\oplus Bel_A](\overline{X})\leq \max_{P\in S_{PA}}P(X).$$ Equality holds only in certain very special cases.
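The mechanics of the main theorem can be illustrated numerically. The sketch below (an assumption-laden illustration, not the paper’s construction: the three-element frame, the masses, and the particular per-distribution shift $q$ are all chosen here, with $q$ set so that the Jeffrey shift mimics Dempster conditioning of a Bayesian prior on simple support evidence) combines $Bel_1$ and $Bel_A$ by Dempster’s rule, Jeffrey-updates the extreme points of $S_P$, and checks the inequality chain. Exact rationals avoid floating-point noise in the comparison.

```python
from fractions import Fraction
from itertools import product

F = Fraction
OMEGA = frozenset({"a", "b", "c"})

def dempster_combine(m1, m2):
    """Dempster's rule of combination: intersect focal elements,
    then renormalise away the mass landing on the empty set (conflict)."""
    combined, conflict = {}, 0
    for (f1, w1), (f2, w2) in product(m1.items(), m2.items()):
        inter = f1 & f2
        if inter:
            combined[inter] = combined.get(inter, 0) + w1 * w2
        else:
            conflict += w1 * w2
    return {f: w / (1 - conflict) for f, w in combined.items()}

def belief(m, x):
    """Bel(X) = total mass on focal elements contained in X."""
    return sum(w for f, w in m.items() if f <= x)

def plausibility(m, x):
    """Pl(X) = 1 - Bel(complement of X)."""
    return 1 - belief(m, OMEGA - x)

def jeffrey_update(p, a, q):
    """Jeffrey's rule on the partition {A, not-A}: set P'(A) = q while
    keeping the conditional probabilities within each cell fixed."""
    pa = sum(p[w] for w in a)
    return {w: q * p[w] / pa if w in a
            else ((1 - q) * p[w] / (1 - pa) if pa < 1 else 0)
            for w in p}

A, X = frozenset({"a", "b"}), frozenset({"a"})

# Prior Bel_1: simple support function, mass 3/10 committed to {a}.
m1 = {frozenset({"a"}): F(3, 10), OMEGA: F(7, 10)}
# Evidence Bel_A: simple support function, mass s = 4/5 committed to A.
s = F(4, 5)
mA = {A: s, OMEGA: 1 - s}
m12 = dempster_combine(m1, mA)

# Extreme points of S_P, the convex set of probabilities dominating Bel_1
# (every P with P(a) >= 3/10).
S_P = [{"a": F(1), "b": F(0), "c": F(0)},
       {"a": F(3, 10), "b": F(7, 10), "c": F(0)},
       {"a": F(3, 10), "b": F(0), "c": F(7, 10)}]

# Per-distribution shift: it is NOT the same for each P, as the review
# stresses. This particular q is an assumed choice for illustration.
S_PA = []
for p in S_P:
    pa = sum(p[w] for w in A)
    q = pa / (1 - s * (1 - pa))
    S_PA.append(jeffrey_update(p, A, q))

lo = min(sum(p[w] for w in X) for p in S_PA)
hi = max(sum(p[w] for w in X) for p in S_PA)
print(lo, belief(m12, X), plausibility(m12, X), hi)  # → 3/10 3/10 1 1
```

In this deliberately simple example the two intervals coincide at $[3/10, 1]$, consistent with the theorem: simple support functions are among the special cases where equality can hold.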

### MSC:

- 68T05 Learning and adaptive systems in artificial intelligence
- 62A01 Foundations and philosophical topics in statistics
- 60A05 Axioms; other general questions in probability

