Example-based learning particle swarm optimization for continuous optimization. (English) Zbl 1250.90113

Summary: Particle swarm optimization (PSO) is a heuristic optimization technique based on swarm intelligence, inspired by the flocking behavior of birds. The canonical PSO suffers from premature convergence. Several improved PSO variants maintain the diversity of the particles during the search, but at the expense of convergence speed. This paper proposes an example-based learning PSO (ELPSO) that overcomes these shortcomings by keeping a balance between swarm diversity and convergence speed. Inspired by the social phenomenon that multiple good examples can guide a crowd toward progress, ELPSO updates the positions of the particles using an example set of multiple global best particles. The particles of the example set are selected from the best particles found so far and are replaced by better particles in first-in-first-out order at each iteration; they are mutually distinct and usually of high quality with respect to the target objective function. Mathematical and numerical results show that ELPSO achieves better diversity and faster convergence than single-gbest and non-gbest PSO algorithms. Finally, computational experiments on benchmark problems show that ELPSO outperforms all of the tested PSO algorithms in terms of both solution quality and convergence time.
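The mechanism described above, a first-in-first-out example set of distinct high-quality particles that replaces the single gbest in the social term, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the inertia weight, acceleration coefficients, example-set size, and the rule for admitting particles into the set are assumptions chosen for demonstration.

```python
import random

def elpso(f, dim, bounds, n_particles=30, example_size=4,
          iters=300, w=0.729, c1=1.494, c2=1.494, seed=0):
    """Sketch of example-based learning PSO (ELPSO).

    Instead of a single global best, a FIFO "example set" of the best
    distinct particles found so far guides the swarm; each particle
    draws its social term from a randomly chosen member of the set.
    Constants and the admission rule are illustrative only.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal best positions
    Pf = [f(x) for x in X]                     # personal best fitness values
    examples = []                              # FIFO list of (fitness, position)
    for _ in range(iters):
        for i in range(n_particles):
            fx = f(X[i])
            if fx < Pf[i]:
                Pf[i], P[i] = fx, X[i][:]
            # admit into the example set if it is better than the worst member
            if len(examples) < example_size:
                if all(e[1] != X[i] for e in examples):   # keep members distinct
                    examples.append((fx, X[i][:]))
            elif fx < max(e[0] for e in examples):
                examples.pop(0)                # oldest member leaves first (FIFO)
                examples.append((fx, X[i][:]))
        for i in range(n_particles):
            g = rng.choice(examples)[1]        # one example guides this particle
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
    return min(zip(Pf, P))
```

Because different particles follow different examples, the swarm is pulled toward several good regions at once rather than collapsing onto one attractor, which is the diversity-preserving effect the summary attributes to the multi-gbest design.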


90C59 Approximation methods and heuristics in mathematical programming
68T05 Learning and adaptive systems in artificial intelligence
68T20 Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.)

