
Computational learning of the conditional phase-type (C-Ph) distribution. Learning C-Ph distributions. (English) Zbl 1371.68236
Summary: This paper presents a new algorithm for learning the structure of a special type of Bayesian network. The conditional phase-type (C-Ph) distribution is a Bayesian network that models the probabilistic causal relationships between a skewed continuous variable, modelled by the Coxian phase-type distribution (a special type of Markov model), and a set of interacting discrete variables. The algorithm takes a data set as input and produces as output the structure, the parameters and graphical representations of the fit of the C-Ph distribution. The algorithm, which uses a greedy-search technique and has been implemented in MATLAB, is evaluated on a simulated data set of 20,000 cases. The results show that the original C-Ph distribution is recaptured, and the fit of the network to the data is discussed.
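To make the continuous component concrete: a Coxian phase-type distribution describes the time to absorption of a Markov chain that passes through phases 1, 2, ..., n in sequence, leaving phase i either to phase i+1 at rate lambda_i or to the absorbing state at rate mu_i. Its density can be written as f(t) = alpha exp(Qt) q, where Q is the sub-generator matrix over the transient phases, q is the vector of absorption rates and alpha = (1, 0, ..., 0) is the initial distribution. The sketch below is a minimal illustration in Python with NumPy/SciPy, not the authors' MATLAB implementation; the choice of three phases and the rate values is purely illustrative.

import numpy as np
from scipy.linalg import expm

def coxian_density(t, lam, mu):
    """Density of a Coxian phase-type distribution at time t.

    lam[i]: rate of moving from phase i to phase i+1 (lam[-1] should be 0).
    mu[i]:  rate of absorption (exit) from phase i.
    """
    n = len(mu)
    Q = np.zeros((n, n))                 # sub-generator of the transient phases
    for i in range(n):
        Q[i, i] = -(lam[i] + mu[i])      # total rate of leaving phase i
        if i + 1 < n:
            Q[i, i + 1] = lam[i]         # sequential transition to the next phase
    q = np.asarray(mu, dtype=float)      # absorption rates into the single absorbing state
    alpha = np.zeros(n)
    alpha[0] = 1.0                       # the process always starts in phase 1
    return float(alpha @ expm(Q * t) @ q)

# Illustrative 3-phase Coxian: evaluate the density at t = 2.0
lam = np.array([0.8, 0.5, 0.0])
mu = np.array([0.2, 0.3, 0.6])
print(coxian_density(2.0, lam, mu))

In the full C-Ph network each configuration of the discrete parent variables would, roughly speaking, select its own rate vectors (lambda, mu); recovering that conditioning structure, together with the rates themselves, from data is what the greedy-search algorithm described in the summary is designed to do.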
MSC:
68T05 Learning and adaptive systems in artificial intelligence
62F15 Bayesian inference
Software:
AutoClass; BNT; Matlab
References:
[1] Aalen O (1995) Phase type distributions in survival analysis. Scand J Stat 22:447–463 · Zbl 0836.62095
[2] Asmussen S, Nerman O, Olsson S (1996) Fitting phase-type distributions via the EM algorithm. Scand J Stat 23:419–441 · Zbl 0898.62104
[3] Bayesware (2000) www.bayesware.com
[4] Buntine W (1996) A guide to the literature on learning probabilistic networks from data. IEEE Trans Knowl Data Eng 8(2):195–210 · Zbl 05109019 · doi:10.1109/69.494161
[5] Castillo E, Gutierrez JM, Hadi AS (1997) Expert systems and probabilistic network models. Springer, New York
[6] Charniak E (1991) Bayesian networks without tears. AI Mag 12(4):50–63
[7] Cheng J, Bell D, Liu W (1997) An algorithm for Bayesian network construction from data. In: Proceedings of AI and STAT ’97, Fort Lauderdale, Florida, pp 83–90
[8] Chickering DM (1996) Learning Bayesian networks is NP-hard. In: Fisher D, Lenz H (eds) Learning from data: artificial intelligence and statistics V. Springer, Berlin, pp 121–130
[9] Cooper G, Herskovits E (1992) A Bayesian method for the induction of probabilistic networks from data. Mach Learn 9:309–342 · Zbl 0766.68109
[10] Cox DR (1955) A use of complex probabilities in the theory of stochastic processes. Proc Camb Phil Soc 51:313–319 · Zbl 0066.37703 · doi:10.1017/S0305004100030231
[11] Faddy M (1994) Examples of fitting structured phase-type distributions. Appl Stoch Models Bus Ind 10: 247–255 · Zbl 0822.60085
[12] Faddy M, McClean SI (1999) Analysing data on length of stay of hospital patients using phase-type distributions. Appl Stoch Models Bus Ind 15:311–317 · Zbl 0966.60072 · doi:10.1002/(SICI)1526-4025(199910/12)15:4<311::AID-ASMB395>3.0.CO;2-S
[13] Friedman N, Goldszmidt M (1996) Building classifiers using Bayesian networks. In: Proceedings of the national conference on artificial intelligence, Menlo Park. AAAI Press, pp 1277–1284
[14] Fung RM, Crawford SL (1990) Constructor: a system for the induction of probabilistic models. In: Proceedings of the seventh national conference on artificial intelligence
[15] Gammerman A, Luo Z, Aitken CGG, Brewer MJ (1994) Computational systems for mixed graphical models. Presented at UNICOM, Conference Centre, London. Adaptive computing and information processing, pp 143–162
[16] Havranek T (1984) A procedure for model search in multidimensional contingency tables. Biometrics 40: 95–100
[17] Heckerman D, Geiger D, Chickering DM (1994) Learning Bayesian networks: the combination of knowledge and statistical data. Technical report MSR-TR-94-09, Microsoft Research · Zbl 0831.68096
[18] Højsgaard S, Thiesson B (1995) BIFROST–block recursive models induced from relevant knowledge, observations and statistical techniques. Comput Stat Data Anal 19:155–175 · Zbl 0875.62566
[19] Ishay E (2002) Fitting phase-type distributions to data from a telephone call center. Research thesis, Israel Institute of Technology
[20] Jensen F (1996) An introduction to Bayesian belief networks. Springer, New York
[21] Johnson MA, Taaffe MR (1990) Matching moments to phase distributions: density function shapes. Commun Stat Stoch Models 6:283–306 · Zbl 0711.60016 · doi:10.1080/15326349908807148
[22] Kahn CE Jr, Roberts LM, Shaffer KA, Haddawy P (1997) Construction of Bayesian networks for mammographic diagnosis of breast cancer. Comput Biol Med 27(1):19–29 · doi:10.1016/S0010-4825(96)00039-X
[23] Lauritzen SL, Wermuth N (1989) Graphical models for association between variables, some of which are qualitative and some quantitative. Ann Stat 17:31–57 · Zbl 0669.62045 · doi:10.1214/aos/1176347003
[24] Lerner U, Parr R, Koller D, Biswas G (2000) Bayesian fault detection and diagnosis in dynamic systems. In: Proceedings of the 17th national conference on artificial intelligence (AAAI)
[25] Marshall AH (2001) Bayesian belief networks using conditional phase-type distributions. PhD thesis, University of Ulster, UK
[26] Marshall AH, McClean SI (2003) Conditional phase-type distributions for modelling patient length of stay in hospital. Int Trans Oper Res 10:565–576 · Zbl 1046.60014 · doi:10.1111/1475-3995.00428
[27] Marshall AH, McClean SI, Shapcott CM (1999) Using Bayesian belief networks to predict the survival of stroke patients. In: IX international symposium on applied stochastic models and data analysis, Lisbon, Portugal, pp 112–116
[28] Marshall AH, McClean SI, Millard PH (2004) Addressing bed costs for the elderly: a new methodology for modelling patient outcomes and length of stay. Health Care Manag Sci 7:27–33 · doi:10.1023/B:HCMS.0000005395.77308.d1
[29] MATLAB (2003) Reference guide. The MathWorks Inc
[30] Mitra SK, Lee TW, Goldbaum MH (2005) A Bayesian network based sequential inference for diagnosis of diseases for retinal images. Pattern Recogn Lett 26(4):459–470 · doi:10.1016/j.patrec.2004.08.010
[31] Murphy K (1998) Inference and learning in hybrid Bayesian networks. U.C. Berkeley Technical Report CSD-98-990
[32] Murphy K (2001) The Bayes net toolbox for MATLAB. Comput Sci Stat 33:1–20
[33] Niedermayer D (1998) An introduction to Bayesian networks and their contemporary applications. University of Saskatchewan, Technical Report 184-3-5440
[34] Olesen KG (1993) Causal probabilistic networks with both discrete and continuous variables. IEEE Trans Pattern Anal Mach Intell 15(3):275–279 · Zbl 05110956 · doi:10.1109/34.204909
[35] Onisko A, Druzdzel MJ, Wasyluk H (2000) Extension of the Hepar II model to multiple-disorder diagnosis. In: Wierzchon ST, Klopotek M, Michalewicz M (eds) Intelligent information systems. Advances in soft computing series. Physica-Verlag, Heidelberg, pp 303–313
[36] Pearl J (1988) Probabilistic reasoning in intelligent systems. Morgan Kaufmann · Zbl 0649.68104
[37] Rebane G, Pearl J (1987) The recovery of causal poly-trees from statistical data. In: Proceedings of workshop on uncertainty in artificial intelligence, Seattle, pp 222–228
[38] Sahami M, Dumais S, Heckerman D (1998) A Bayesian approach to filtering junk email. In: AAAI workshop on learning for text categorization, Madison, Wisconsin. AAAI Technical Report WS-98-05
[39] Schwarz GE (1978) Estimating the dimension of a model. Ann Stat 6(2):461–464 · Zbl 0379.62005 · doi:10.1214/aos/1176344136
[40] Shaw B, Marshall AH (2006) Modeling the health care costs of geriatric inpatients. IEEE Trans Inf Technol Biomed 10(3):526–532 · Zbl 05456231 · doi:10.1109/TITB.2005.863821
[41] Sterritt R, Marshall AH, Shapcott CM, McClean SI (2000) Exploring dynamic Bayesian belief networks for intelligent fault management systems. In: Proc IEEE Int Conf SMC, pp 3646–3652
[42] Stutz J, Taylor W, Cheeseman P (1989) AutoClass C: general information. NASA, Ames Research Centre
[43] Suzuki J (1996) Learning Bayesian belief networks based on the MDL principle: an efficient algorithm using the branch and bound technique. In: Proceedings of the international conference on machine learning, Bari, Italy
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.