# zbMATH — the first resource for mathematics

Selection of artificial neural network models for survival analysis with genetic algorithms. (English) Zbl 1452.62792
Summary: In follow-up clinical studies, the main time end-point is failure measured from a specific starting point (e.g. treatment or surgery). A deeper investigation concerns the causes of failure. Statistical analysis typically focuses on the cause-specific hazard functions of possibly censored survival data. In the framework of discrete-time models and competing risks, a multilayer perceptron has already been proposed as an extension of generalized linear models with multinomial errors using a non-linear predictor (PLANNCR). Following standard practice, weight decay was adopted to modulate model complexity. Here a genetic algorithm is considered for the complexity control of PLANNCR, allowing each parameter of the model to be regularized independently, with the ICOMP information criterion used as fitness function. To demonstrate the critical aspects and the benefits of the technique, an application to a case series of 1793 women with primary breast cancer without axillary lymph node involvement is presented.
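The search described above can be sketched as a genetic algorithm whose chromosomes are vectors of log weight-decay coefficients, one per model parameter, scored by a penalized-likelihood criterion. This is a minimal illustration, not the paper's implementation: it substitutes a penalized logistic regression on toy person-period data for the PLANNCR network, and an AIC-style complexity term for ICOMP; all data, function names, and the effective-parameter heuristic are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy person-period data for a discrete-time hazard model: each row is a
# (subject, interval) record and y = 1 if the event occurs in that
# interval; covariates drive the hazard through a logistic link.
n, p = 300, 4
X = rng.normal(size=(n, p))
true_beta = np.array([1.0, -0.5, 0.0, 0.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ true_beta - 1.0))))

def fit_penalized_logistic(X, y, lam, iters=300, lr=0.1):
    """Gradient descent for logistic regression with a separate
    quadratic (weight-decay) penalty lam[j] on each coefficient."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p_hat = 1.0 / (1.0 + np.exp(-(X @ beta)))
        beta -= lr * (X.T @ (p_hat - y) / len(y) + lam * beta)
    return beta

def deviance(X, y, beta):
    p_hat = np.clip(1.0 / (1.0 + np.exp(-(X @ beta))), 1e-9, 1 - 1e-9)
    return -2.0 * np.sum(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))

def fitness(log_lam):
    # Clipping keeps the decay coefficients in a numerically stable range.
    lam = np.exp(np.clip(log_lam, -6.0, 2.0))
    beta = fit_penalized_logistic(X, y, lam)
    # AIC-style stand-in for ICOMP: deviance plus a complexity term that
    # heuristically counts effectively unpenalized parameters.
    eff = np.sum(1.0 / (1.0 + lam))
    return deviance(X, y, beta) + 2.0 * eff

# Genetic algorithm over per-parameter log-penalty vectors.
pop = rng.normal(0.0, 2.0, size=(20, p))        # 20 chromosomes
for gen in range(30):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[:10]]       # lower criterion = fitter
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(p) < 0.5               # uniform crossover
        child = np.where(mask, a, b)
        child += rng.normal(0.0, 0.3, size=p)    # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(c) for c in pop])]
print(np.exp(np.clip(best, -6.0, 2.0)).round(3))  # one decay per parameter
```

Uniform crossover and Gaussian mutation follow standard GA practice; in the actual method the fitness evaluation would refit the neural network and compute ICOMP for each candidate penalty vector.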

##### MSC:
 62P10 Applications of statistics to biology and medical sciences; meta analysis
 62B10 Statistical aspects of information-theoretic topics
 62-08 Computational methods for problems pertaining to statistics
 68T05 Learning and adaptive systems in artificial intelligence
 90C59 Approximation methods and heuristics in mathematical programming
##### Software:
BRENT; Hmisc; MASS (R); R