A dynamic all parameters adaptive BP neural networks model and its application on oil reservoir prediction. (English) Zbl 1205.86035

Summary: A dynamic all-parameters-adaptive BP neural network model is proposed by fusing genetic algorithms (GAs), simulated annealing (SA), and the error back-propagation neural network (BPNN), so that the merits of each paradigm offset the demerits of the others. Using a multi-encoding scheme, the model dynamically and adaptively optimizes the input nodes, hidden nodes, transfer functions, weights, and biases of the BP network. Subject to an accuracy requirement, a simple network architecture (fewer input and hidden nodes) is constructed in order to improve the network's adaptation and generalization ability and to greatly reduce the subjective choice of structural parameters. Results of an application to oil reservoir prediction show that the proposed model, despite its comparatively simple structure, meets the precision requirement and enhances generalization ability.
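The hybrid scheme described above — a GA whose chromosomes encode both the structure and the weights of a BP network, with an SA-style acceptance rule guarding against premature convergence — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the toy XOR data set, the candidate transfer functions, the mutation rates, and the cooling schedule are all hypothetical choices made for the example.

```python
# Minimal sketch (NOT the paper's code) of a GA + SA hybrid that evolves
# both structure (transfer function choice) and weights of a one-hidden-layer
# network.  Data set, parameter values, and function names are assumptions.
import math
import random

random.seed(0)

# Toy data: y = x1 XOR x2, a standard non-linear test problem.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

TRANSFERS = {  # candidate transfer functions the encoding can select among
    "tanh": math.tanh,
    "logistic": lambda z: 1.0 / (1.0 + math.exp(-z)),
}

def make_net(n_hidden, transfer):
    """Random one-hidden-layer network; each hidden row carries a bias term."""
    w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(n_hidden)]
    w_o = [random.uniform(-1, 1) for _ in range(n_hidden + 1)]  # + output bias
    return {"n": n_hidden, "f": transfer, "w_h": w_h, "w_o": w_o}

def predict(net, x):
    f = TRANSFERS[net["f"]]
    h = [f(w[0] * x[0] + w[1] * x[1] + w[2]) for w in net["w_h"]]
    return sum(w * v for w, v in zip(net["w_o"], h)) + net["w_o"][-1]

def error(net):
    return sum((predict(net, x) - y) ** 2 for x, y in DATA)

def mutate(net):
    """Mutation over all encoded parameters: weights, and sometimes structure."""
    child = make_net(net["n"], net["f"])
    child["w_h"] = [[w + random.gauss(0, 0.3) for w in row] for row in net["w_h"]]
    child["w_o"] = [w + random.gauss(0, 0.3) for w in net["w_o"]]
    if random.random() < 0.1:  # occasionally switch the transfer function
        child["f"] = random.choice(list(TRANSFERS))
    return child

def evolve(pop_size=20, generations=200, t0=1.0, cooling=0.98):
    pop = [make_net(random.randint(2, 4), random.choice(list(TRANSFERS)))
           for _ in range(pop_size)]
    temp = t0
    for _ in range(generations):
        for i, parent in enumerate(pop):
            child = mutate(parent)
            d = error(child) - error(parent)
            # SA acceptance: keep improvements; keep worse children with
            # probability exp(-d / temp), which shrinks as temp cools.
            if d < 0 or random.random() < math.exp(-d / temp):
                pop[i] = child
        temp *= cooling  # annealing schedule
    return min(pop, key=error)

best = evolve()
```

The SA acceptance step is what distinguishes this from a plain GA: early on (high temperature) the population explores freely, while late in the run only improving mutations survive, mimicking the exploitation phase the paper attributes to the hybrid.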


86A20 Potentials, prospecting
68T05 Learning and adaptive systems in artificial intelligence
Full Text: DOI


[1] Statake, T.; Katsumi, M.; Nakamura, N., Neural network approach for minimizing the makespan of the general job-shop, International Journal of Production Economics, 33, 67-74 (1994)
[2] Bode, J., Neural networks for cost estimation: simulations and pilot application, International Journal of Production Research, 38, 6, 1231-1254 (2000) · Zbl 0944.90528
[3] Sabuncuoglu, I.; Gurgun, B., A neural network model for scheduling problems, European Journal of Operational Research, 93, 288-299 (1996) · Zbl 0913.90180
[4] Zhu, Q. M., A back propagation algorithm to estimate the parameters of non-linear dynamic rational models, Applied Mathematical Modeling, 27, 169-187 (2003) · Zbl 1033.93064
[5] Olabia, A. G.; Casalino, G., An ANN and Taguchi algorithms integrated approach to the optimization of \(CO_2\) laser welding, Advances in Engineering Software, 37, 643-648 (2006)
[6] Loukas, Y. L., Artificial neural networks in liquid chromatography: Efficient and improved quantitative structure-retention relationship models, Journal of Chromatography A, 904, 119-129 (2000)
[7] Khaw, J. F.C.; Lim, B. S.; Lim, L. E.N., Optimal design of neural networks using the Taguchi method, Neurocomputing, 7, 225-245 (1995) · Zbl 0825.68537
[8] Maier, H. R.; Dandy, G. C., The effect of internal parameters and geometry on the performance of back-propagation neural networks: an empirical study, Environmental Modeling and Software, 13, 193-209 (1998)
[9] Maier, H. R.; Dandy, G. C., Understanding the behavior and optimizing the performance of back-propagation neural networks: an empirical study, Environmental Modeling and Software, 13, 179-191 (1998)
[10] Benardos, P. G.; Vosniakos, G.-C., Prediction of surface roughness in CNC face milling using neural networks and Taguchi’s design of experiments, Robotics and Computer Integrated Manufacturing, 18, 343-354 (2002)
[11] Ross, J. P., Taguchi Techniques for Quality Engineering (1996), McGraw-Hill: New York
[12] Tang, Wanmei, The study of the optimal structure of BP neural network, Systems Engineering - Theory and Practice, 10, 95-100 (2005)
[13] Fahlman, S. E.; Lebiere, C., The Cascade-Correlation Learning Architecture, Advances in Neural Information Systems, vol. 2 (1990), Morgan Kaufmann: Los Altos, CA
[14] Balkin, S. D.; Ord, J. K., Automatic neural network modeling for univariate time series, International Journal of Forecasting, 16, 509-515 (2000)
[15] Islam, M. M.; Murase, K., A new algorithm to design compact two hidden-layer artificial neural networks, Neural Networks, 14, 1265-1278 (2001)
[16] Jiang, X.; Wah, A. H.K. S., Constructing and training feed-forward neural networks for pattern classification, Pattern Recognition, 36, 853-867 (2003)
[17] Ma, L.; Khorasani, K., A new strategy for adaptively constructing multilayer feed forward neural networks, Neurocomputing, 51, 361-385 (2003)
[18] Castillo, P. A.; Merelo, J. J.; Prieto, A.; Rivas, V.; Romero, G., G-Prop: global optimization of multilayer perceptrons using GAs, Neurocomputing, 35, 149-163 (2000) · Zbl 1003.68625
[19] Arifovic, J.; Gencay, R., Using genetic algorithms to select architecture of a feed forward artificial neural network, Physica A, 289, 574-594 (2001) · Zbl 0971.68506
[20] Harri, N.; Teri, H., Evolving the neural network model for forecasting air pollution time series, Engineering Applications of Artificial Intelligence, 17, 159-167 (2004)
[21] Zhang, Guangzheng; Huang, Deshuang, Prediction of inter-residue contacts map based on genetic algorithm optimized radial basis function neural network and binary input encoding scheme, Journal of Computer-Aided Molecular Design, 18, 797-810 (2004)
[22] Li, Shujuan; Li, Yan, A GA-based NN approach for makespan estimation, Applied Mathematics and Computation, 185, 1003-1014 (2007) · Zbl 1142.68476
[23] Kirkpatrick, S.; Gelatt, C. D.; Vecchi, M. P., Optimization by simulated annealing, Science, 220, 671-680 (1983) · Zbl 1225.90162
[24] Poranen, Timo, A simulated annealing algorithm for determining the thickness of a graph, Information Sciences, 172, 155-172 (2005) · Zbl 1087.68074
[25] Li, Zhang; Ganesh, S., An evaluation of back-propagation neural networks for the optimal design of structural systems: Part I. Training procedures, Computer Methods in Applied Mechanics and Engineering, 191, 2873-2886 (2002) · Zbl 1131.74332
[26] Srinivas, M.; Patnaik, L. M., Adaptive probabilities of crossover and mutation in genetic algorithms, IEEE Transactions on Systems, Man, and Cybernetics, 4, 656-667 (1994)
[29] Li, Yibao; Zhang, Xueyong, Study of improving algorithms based on the BP neural network, Journal of Hefei University of Technology, 6, 668-671 (2005)
[30] Wang, Ling, Intelligent Optimization Algorithms with Applications (2001), Tsinghua Press: Tsinghua Press Beijing
[31] Hou, Fujun; Wu, Qizong, Forecast on temporal sequence of railway freight transport volume based on BP-SA mixing and optimizing solution, Railway Transport and Economy, 10, 51-53 (2003)
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.