An Improved Bat Algorithm with Grey Wolf Optimizer for Solving Continuous Optimization Problems
Narges Jafari 1, Farhad Soleimanian Gharehchopogh 2
1 - Department of Computer Engineering, Urmia Branch, Islamic Azad University, Urmia, Iran
2 - Department of Computer Engineering, Urmia Branch, Islamic Azad University, Urmia, Iran
1. Abedi, M. and F.S. Gharehchopogh, An improved opposition based learning firefly algorithm with dragonfly algorithm for solving continuous optimization problems. Intelligent Data Analysis, 2020. 24(2): p. 309-338.
2. Gharehchopogh, F.S. and H. Gholizadeh, A comprehensive survey: Whale Optimization Algorithm and its applications. Swarm and Evolutionary Computation, 2019. 48: p. 1-24.
3. Shayanfar, H. and F.S. Gharehchopogh, Farmland fertility: A new metaheuristic algorithm for solving continuous optimization problems. Applied Soft Computing, 2018. 71: p. 728-746.
4. Amjad, S.S.G., Farhad, A Novel Hybrid Approach for Email Spam Detection based on Scatter Search Algorithm and K-Nearest Neighbors. Journal of Advances in Computer Engineering and Technology, 2019. 5(3): p. 169-178.
5. Khanalni, S. and F.S. Gharehchopogh, A New Approach for Text Documents Classification with Invasive Weed Optimization and Naive Bayes Classifier. Journal of Advances in Computer Engineering and Technology, 2018. 4(3): p. 31-40.
6. Allahverdipour, A. and F. Soleimanian Gharehchopogh, An Improved K-Nearest Neighbor with Crow Search Algorithm for Feature Selection in Text Documents Classification. Journal of Advances in Computer Research, 2018. 9(2): p. 37-48.
7. Majidpour, H. and F. Soleimanian Gharehchopogh, An Improved Flower Pollination Algorithm with AdaBoost Algorithm for Feature Selection in Text Documents Classification. Journal of Advances in Computer Research, 2018. 9(1): p. 29-40.
8. Gharehchopogh, F.S., S.R. Khaze, and I. Maleki, A new approach in bloggers classification with hybrid of k-nearest neighbor and artificial neural network algorithms. Indian Journal of Science and technology, 2015. 8(3): p. 237.
9. Geem, Z.W., J.H. Kim, and G.V. Loganathan, A new heuristic optimization algorithm: harmony search. simulation, 2001. 76(2): p. 60-68.
10. Rabani, H. and F. Soleimanian Gharehchopogh, An Optimized Firefly Algorithm based on Cellular Learning Automata for Community Detection in Social Networks. Journal of Advances in Computer Research, 2019. 10(3): p. 13-30.
11. Gharehchopogh, F.S., H. Shayanfar, and H. Gholizadeh, A comprehensive survey on symbiotic organisms search algorithms. Artificial Intelligence Review, 2019: p. 1-48.
12. Osmani, A., J.B. Mohasefi, and F.S. Gharehchopogh, Sentiment Classification Using Two Effective Optimization Methods Derived From The Artificial Bee Colony Optimization And Imperialist Competitive Algorithm. The Computer Journal, 2020.
13. Hasanluo, M. and F. Soleimanian Gharehchopogh, Software cost estimation by a new hybrid model of particle swarm optimization and k-nearest neighbor algorithms. Journal of Electrical and Computer Engineering Innovations, 2016. 4(1): p. 49-55.
14. Holland, J.H., Genetic algorithms. Scientific american, 1992. 267(1): p. 66-73.
15. Kennedy, J., Particle swarm optimization, in Encyclopedia of machine learning. 2011, Springer. p. 760-766.
16. Storn, R. and K. Price, Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. Journal of global optimization, 1997. 11(4): p. 341-359.
17. Karaboga, D., An idea based on honey bee swarm for numerical optimization. 2005, Technical report-tr06, Erciyes university, engineering faculty, computer engineering department.
18. Yang, X.-S., Firefly algorithm. Nature-inspired metaheuristic algorithms, 2008. 20: p. 79-90.
19. Yang, X.-S., A new metaheuristic bat-inspired algorithm, in Nature inspired cooperative strategies for optimization (NICSO 2010). 2010, Springer. p. 65-74.
20. Mirjalili, S., S.M. Mirjalili, and A. Lewis, Grey wolf optimizer. Advances in engineering software, 2014. 69: p. 46-61.
21. Zhang, J.W. and G.G. Wang. Image matching using a bat algorithm with mutation. in Applied Mechanics and Materials. 2012. Trans Tech Publ.
22. Saha, S.K., et al., A new design method using opposition-based BAT algorithm for IIR system identification problem. International Journal of Bio-Inspired Computation, 2013. 5(2): p. 99-132.
23. Yılmaz, S., E.U. Kucuksille, and Y. Cengiz, Modified bat algorithm. Elektronika ir Elektrotechnika, 2014. 20(2): p. 71-78.
24. Li, L. and Y. Zhou, A novel complex-valued bat algorithm. Neural Computing and Applications, 2014. 25(6): p. 1369-1381.
25. Cai, X., et al., Bat algorithm with Gaussian walk. International Journal of Bio-Inspired Computation, 2014. 6(3): p. 166-174.
26. Dao, T.-K., et al., Compact bat algorithm, in Intelligent Data analysis and its Applications, Volume II. 2014, Springer. p. 57-68.
27. Fister, I., S. Fong, and J. Brest, A novel hybrid self-adaptive bat algorithm. The Scientific World Journal, 2014. 2014.
28. Jun, L., L. Liheng, and W. Xianyi, A double-subpopulation variant of the bat algorithm. Applied Mathematics and Computation, 2015. 263: p. 361-377.
29. Wang, G.-G., H.E. Chu, and S. Mirjalili, Three-dimensional path planning for UCAV using an improved bat algorithm. Aerospace Science and Technology, 2016. 49: p. 231-238.
30. Liu, Q., et al., A novel hybrid bat algorithm for solving continuous optimization problems. Applied Soft Computing, 2018. 73: p. 67-82.
31. Chakri, A., et al., New directional bat algorithm for continuous optimization problems. Expert Systems with Applications, 2017. 69: p. 159-175.
32. Purkait, G., et al., An Improved Bio-inspired BAT Algorithm for Optimization, in Progress in Advanced Computing and Intelligent Engineering. 2019, Springer. p. 241-248.
33. Asghari Agcheh Dizaj, S. and F. Soleimanian Gharehchopogh, A New Approach to Software Cost Estimation by Improving Genetic Algorithm with Bat Algorithm. Journal of Computer & Robotics, 2018. 11(2): p. 17-30.
Journal of Advances in Computer Engineering and Technology
An Improved Bat Algorithm with Grey Wolf Optimizer for Solving Continuous Optimization Problems
Abstract— Metaheuristic algorithms are used to solve NP-hard optimization problems. These algorithms have two main components, i.e. exploration and exploitation, and try to strike a balance between the two to achieve the best possible near-optimal solution. The bat algorithm is a metaheuristic algorithm with poor exploration and exploitation. In this paper, the exploration and exploitation processes of the Grey Wolf Optimizer (GWO) algorithm are applied to some of the solutions produced by the bat algorithm. Part of the population of the bat algorithm is thus changed by the two processes (i.e. exploration and exploitation) of GWO; the new population enters the bat algorithm population when its result is better than that of the exploitation and exploration operators of the bat algorithm. Thereby, better new solutions are introduced into the bat algorithm at each step. In this paper, 20 mathematical benchmark functions are used to evaluate and compare the proposed method. The simulation results show that the proposed method outperforms the bat algorithm and other metaheuristic algorithms in most implementations and has a high performance.
Index Terms— Bat algorithm, Grey Wolf Optimizer, continuous problems, optimization.
I. INTRODUCTION
Optimization means finding the best possible or most desirable solution to a problem, a task commonly encountered in all disciplines of engineering and science. Optimization problems cover a wide range of problems, and optimization algorithms inspired by nature can be either deterministic or stochastic [1, 2]. Exact (deterministic) methods can find the optimal solution precisely; however, they require heavy, complex computations, are not efficient for NP-hard problems, and their execution time grows exponentially with the dimensions of the optimization problem. Approximation algorithms, on the other hand, can find optimal or near-optimal solutions with good execution time and high efficiency. For this reason, the use of nature-inspired stochastic optimization algorithms with efficient computations has been suggested instead of deterministic methods and algorithms [3]. Heuristic and metaheuristic algorithms are approximation methods for solving optimization problems; heuristic algorithms most often seek good near-optimal solutions in acceptable computational time [4-6]. However, heuristic algorithms do not guarantee optimal solutions, and their main drawbacks include producing a limited number of solutions, getting trapped in local optima, and premature convergence. Because of these disadvantages, metaheuristic algorithms were introduced. Each metaheuristic algorithm uses its own mechanisms to escape from, or avoid getting trapped in, local optima, and aims to find high-quality solutions to optimization problems.
Metaheuristic optimization algorithms are becoming increasingly popular in engineering applications [4-13], because 1) they have relatively simple concepts and are easy to implement, 2) they do not need derivative information about the objective function, 3) they can avoid getting trapped in local optima, and 4) they can be applied to a wide variety of problems. Nature-inspired metaheuristic algorithms solve optimization problems by mimicking biological or physical phenomena. These algorithms can be grouped into three main categories: evolution-based, physics-based, and swarm-based methods. Evolution-based approaches are inspired by the laws of natural evolution.
The first and most popular metaheuristic algorithm is the genetic algorithm, introduced by Holland in 1992 [14], followed by the popular particle swarm optimization (PSO) algorithm, first presented by Kennedy and Eberhart [15]; modified and improved versions of both were later developed. After these two basic and robust metaheuristic algorithms, many others were introduced for solving optimization problems. One of them is the differential evolution algorithm [16], a stochastic optimization method that applies mutation, crossover, and selection operators to every generation of the population to reach the global optimum. The artificial bee colony (ABC) algorithm is a population-based stochastic global optimization approach, inspired by the foraging behavior of honey bee colonies in nature, used to solve continuous optimization problems with large search spaces [17]. The firefly algorithm, inspired by the flashing behavior of fireflies, was introduced by Yang in 2008 [18].
The bat algorithm [19] was developed by Yang in 2010. It is inspired by the echolocation behavior of micro-bats: bats emit a very loud pulse and listen to its reflection. To overcome early or late convergence and to avoid getting trapped in local optima, the weakness of a metaheuristic algorithm is first identified and then covered with another metaheuristic algorithm that does not have that weakness. In this way, the two hybridized algorithms cover each other's weaknesses and improve overall performance. This approach is widely used in the hybridization of metaheuristic algorithms because the behavior of each algorithm has already been tested and no new complexity or problems are introduced. GWO is a metaheuristic algorithm introduced in 2014 [20] that has attracted the interest of many researchers in applications related to the optimization of various problems, because simulation results show the superiority of GWO over several other robust metaheuristic algorithms.
Hybrid metaheuristic algorithms are based on three main principles [1]. The first is to hybridize two different algorithms so that they cover each other's weaknesses. The second is to reach the global optimum more quickly by combining the processes of the two algorithms. The third, and the most important for all metaheuristic algorithms, is to maintain a balance between exploration and exploitation by combining the processes of the two algorithms. By keeping this balance, the algorithm performs better and gets as close as possible to the global optimal solution.
In this paper, the bat algorithm is improved with GWO to cover its weaknesses in solving continuous optimization problems. The bat algorithm has poor exploration and exploitation, and the operators of GWO are used to compensate for these weaknesses. In the proposed method, the exploration and exploitation processes of GWO are applied to some of the solutions produced by the bat algorithm. In this way, some solutions of the bat algorithm are optimized sooner, which increases its convergence rate; moreover, the new exploration operator helps the algorithm escape the traps of local optima.
The rest of the paper is organized as follows: Section 2 reviews previous works; Section 3 describes the fundamental research; Section 4 presents the proposed method; Section 5 provides the obtained results; and Section 6 concludes and outlines future work.
II. PREVIOUS WORKS
Zhang and Wang proposed a version of the bat algorithm for image matching [21]. They applied two changes to the original bat algorithm: first, they used constant frequency and loudness, and then they added a mutation operator to increase population diversity. They tested it on image matching tasks and found that the proposed algorithm performs better than the original bat algorithm. In another study, the bat algorithm was hybridized with different evolutionary techniques; this hybridization improved the local search capability of the original bat algorithm. Saha et al. [22] improved the convergence rate of the bat algorithm using opposition-based learning and tested it on several benchmarks; the simulation results showed that their method increases the convergence rate and accuracy of the bat algorithm. Yılmaz et al. [23] improved the exploration mechanism of the original bat algorithm by modifying the equations for loudness and pulse emission rate. They tested the modified bat algorithm on 15 different benchmark functions and concluded that it performs better than the original. In addition, Li and Zhou developed the exploration mechanism of the bat algorithm by introducing complex-valued coding into it [24]; they updated the real and imaginary parts of the complex code separately to increase population diversity. To tackle high-dimensional problems, a bat algorithm with a Gaussian walk was developed by Cai et al. [25]. They improved local search capability by introducing a Gaussian walk instead of a uniform random walk, and also modified the velocity-update equation of the bat algorithm, which led to greater population diversity and extended the search dimension. The compact bat algorithm developed by Dao et al. [26] is suitable for hardware-resource-constrained environments: it replaces the design variables of the search space with a probabilistic representation of the population. Their study showed that this method can be used effectively in cases where memory is limited.
Fister et al. [27] developed a self-adaptive bat algorithm in which the control parameters, as in the differential evolution algorithm, adapt themselves. They tested it on ten benchmark functions and found that the proposed method can be used effectively for continuous optimization problems. To develop the local and global search capability of the bat algorithm, Jun et al. developed a double-subpopulation variant [28]. They used two subpopulations, an external one and an internal one: global exploration is improved by the external subpopulation and local exploitation by the internal subpopulation. They tested the proposed algorithm on several benchmark functions and concluded that it outperforms the bat algorithm. Wang et al. [29] presented an improved version of the bat algorithm that hybridizes it with differential evolution to select the best solution in the bat population. They applied this algorithm to three-dimensional path planning problems and concluded that the proposed method outperforms the bat algorithm.
In more recent research, a new hybrid bat algorithm was proposed by Liu et al. in 2018 for solving continuous optimization problems [30]. Three modifications were applied to the bat algorithm to increase its local search capability and its ability to avoid getting trapped in local optima. The performance of the proposed method was evaluated on benchmark functions, and the results showed that the modified algorithm performs much better.
In another study, to overcome early convergence and low exploration, directional echolocation was introduced into the bat algorithm to increase its exploration and exploitation capabilities [31]. In addition, three other improvements were incorporated into the bat algorithm to enhance its performance, and statistical test results showed the superiority of the proposed algorithm. In 2019, an improved bat algorithm was proposed by Purkait et al. [32] for solving optimization problems; the main purpose of their work is to present and interpret the behavior of the bat algorithm in the form of a modified bat algorithm, and it also describes the concept, advantages, limitations, and applications of the bat algorithm in different domains.
III. FUNDAMENTAL RESEARCH
1. Bat Algorithm
Yang developed the bat-inspired metaheuristic optimization algorithm in 2010 [19]. The algorithm is based on the echolocation behavior of bats with varying pulse emission rates and loudness, and the search is augmented by a random walk. The bat algorithm was built around the key idea of frequency tuning based on the echolocation behavior of bats. The echolocation features of bats can be idealized in the bat algorithm as three rules [33]:
• All bats use echolocation to sense distance, and they also distinguish between food/prey and background obstacles in some "magical" way.
• Bats fly randomly with velocity vi at position xi with a fixed frequency fmin, varying wavelength λ and loudness A0 to search for prey. They can automatically adjust the wavelength (or frequency) of their emitted pulses and the rate of pulse emission, r, depending on the proximity of the prey.
• Although the loudness can vary in many ways, it is assumed that the loudness varies from a large positive A0 to a minimum value Amin.
The basic steps of the bat algorithm are summarized in the pseudo-code presented in Figure (1).
Bat Algorithm
Objective function f(x), x = (x1, ..., xd)
Initialize the bat population xi (i = 1, 2, ..., n) and velocities vi
Define pulse frequency fi at xi
Initialize pulse rates ri and the loudness Ai
While (t < Max number of iterations)
    Generate new solutions by adjusting frequency and updating
    velocities and locations/solutions [equations (1) to (3)]
    If (rand > ri)
        Select a solution among the best solutions
        Generate a local solution around the selected best solution
    End if
    Generate a new solution by flying randomly
    If (rand < Ai and f(xi) < f(x*))
        Accept the new solution
    End if
    Rank the bats and find the current best x*
End while
Post-process results and visualization
Figure 1: The bat algorithm pseudo-code
For each bat i, a position xi and velocity vi are defined in a d-dimensional search space and must be updated during each iteration. The new position xi(t) and velocity vi(t) at time step t are calculated by equations (1), (2) and (3):
(1) fi = fmin + (fmax − fmin) β
(2) vi(t) = vi(t−1) + (xi(t−1) − x*) fi
(3) xi(t) = xi(t−1) + vi(t)
In the above equations, β is a random vector in the range [0, 1] drawn from a uniform distribution, and x* is the best global solution found so far, obtained by comparing the solutions of all n bats in the current iteration. The product λi fi is the velocity increment; depending on the problem, either fi (keeping λi fixed) or λi (keeping fi fixed) can be used to adjust the velocity.
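To make the update concrete, the following is a small NumPy sketch of equations (1) to (3). The function name, the vectorized form, and the default frequency bounds fmin = 0 and fmax = 2 are illustrative assumptions, not the paper's code.

```python
import numpy as np

def bat_update(x, v, x_star, f_min=0.0, f_max=2.0, rng=None):
    """One frequency-tuned update of bat positions, equations (1)-(3).

    x, v   : (n, d) arrays of bat positions and velocities
    x_star : (d,) current global best position
    Frequency bounds are illustrative defaults, not the paper's settings.
    """
    rng = rng or np.random.default_rng()
    beta = rng.random((x.shape[0], 1))       # beta ~ U[0, 1], one per bat
    f = f_min + (f_max - f_min) * beta       # eq. (1): per-bat frequency
    v_new = v + (x - x_star) * f             # eq. (2): velocity update
    x_new = x + v_new                        # eq. (3): position update
    return x_new, v_new
```

Note that bats already sitting at x* keep their previous velocity, since the (xi(t−1) − x*) term vanishes for them.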
2. GWO
The GWO algorithm [20] is inspired by the life of grey wolves in nature; the wolves have a particular social hierarchy, i.e. alpha, beta, delta and omega wolves, as shown in Figure (2).
Figure 2: Optimization Algorithm Hierarchy
As shown in Figure (2), the alpha is at the top of the pack and is mainly responsible for deciding on hunting, sleeping place, time to wake, and so on. The alpha’s decisions are dictated to the pack; however, some kind of democratic behavior has also been observed, in which an alpha follows the other wolves in the pack. The betas are subordinate wolves that help the alpha in decision-making or other pack activities. In fact, beta plays the role of an adviser to the alpha and discipliner for the pack. The beta reinforces the alpha’s commands throughout the pack and gives feedback to the alpha. The lowest ranking gray wolf is omega. The omega plays the role of scapegoat. The omega always has to submit to other dominant wolves. They are the last wolves that are allowed to eat. If a wolf is not an alpha, beta, or omega, he/she is called a subordinate or delta. Deltas have to submit to alphas and betas, but they dominate omega. Scouts, sentinels, elders, hunters and caretakers belong to this category. Scouts are responsible for watching the boundaries of the territory and warning the pack in case of any danger. The main phases of GWO hunting are as follows:
• Tracking, chasing, and approaching the prey
• Pursuing, encircling, and harassing the prey until it stops moving
• Attacking the prey
In order to mathematically model the social hierarchy of grey wolves, the fittest solution is considered the alpha (α), and the second and third best solutions are the beta (β) and delta (δ), respectively. The rest of the candidate solutions are assumed to be omega (ω). In GWO, the hunting (optimization) process is guided by α, β and δ, and the ω wolves follow these three. Grey wolves encircle the prey during hunting; to model this encircling behavior mathematically, equations (4) and (5) are used [20].
(4) D = |C · Xp(t) − X(t)|
(5) X(t + 1) = Xp(t) − A · D
where t indicates the current iteration, Xp is the position vector of the prey, and X is the position vector of a grey wolf. The coefficient vectors A and C are calculated by equations (6) and (7):
(6) A = 2a · r1 − a
(7) C = 2 · r2
where the components of a decrease linearly from 2 to 0 over the course of the iterations, and r1 and r2 are random vectors in [0, 1]. To simulate hunting, the positions of the three best wolves (α, β and δ) guide the position update of every other wolf, as given by equations (8), (9) and (10):
(8) Dα = |C1 · Xα − X|,  Dβ = |C2 · Xβ − X|,  Dδ = |C3 · Xδ − X|
(9) X1 = Xα − A1 · Dα,  X2 = Xβ − A2 · Dβ,  X3 = Xδ − A3 · Dδ
(10) X(t + 1) = (X1 + X2 + X3) / 3
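The GWO position update of equations (4) to (10) can be sketched in NumPy as follows; the function name, array shapes, and vectorized form are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def gwo_update(X, X_alpha, X_beta, X_delta, a, rng=None):
    """Move all wolves toward the alpha/beta/delta leaders (eqs. (4)-(10)).

    X : (n, d) wolf positions; X_alpha/X_beta/X_delta : (d,) leader positions;
    a : scalar that decreases linearly from 2 to 0 over the iterations.
    """
    rng = rng or np.random.default_rng()
    n, d = X.shape
    terms = []
    for leader in (X_alpha, X_beta, X_delta):
        A = 2 * a * rng.random((n, d)) - a   # eq. (6): A = 2a*r1 - a
        C = 2 * rng.random((n, d))           # eq. (7): C = 2*r2
        D = np.abs(C * leader - X)           # eq. (8): distance to leader
        terms.append(leader - A * D)         # eq. (9): candidate position
    return (terms[0] + terms[1] + terms[2]) / 3.0   # eq. (10): average
```

When a is large the random coefficients push wolves away from the leaders (exploration); as a shrinks toward 0, each wolf converges to the centroid of the three leaders (exploitation).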
IV. PROPOSED METHOD
In the proposed method, in each iteration half of the bat population is selected at random to be updated by the exploration and exploitation operators of GWO. The indices of the selected solutions are drawn by equation (11):
(11) Idx = Randint(1, Npop), numel(Idx) = Npop/2
In equation (12), the GWO operators (equations (4) to (10)) are applied to the selected solutions X(Idx) to produce the new candidate solutions X_new.
Pseudo-code (1): Update of the bat algorithm population
01: X_new are the new solutions produced by the GWO operators
02: N is the size of X_new
03: Idx holds the index of each solution in the bat population
04: For j = 1 : N
05:   xg = X_new(j)
06:   xb = X(Idx(j))
07:   If (f(xg) < f(xb))
08:     X(Idx(j)) = xg
09:     F(Idx(j)) = f(xg)
10:   End
11: End
According to pseudo-code (1), a greedy method is used to replace and update the solutions of the bat algorithm in the proposed method. The solutions modified or improved by the GWO algorithm are stored in X_new, and the indices of their parents in the bat algorithm population are stored in the Idx variable. Each solution produced by the GWO operators is therefore compared greedily, based on the fitness function, with its parent in the bat algorithm population, and replaces the parent if it is better. The proposed method, after adding the exploration and exploitation processes of GWO to improve the bat algorithm, is shown in Figure (4).
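The greedy replacement step described above can be sketched as a short Python function; the function and variable names are illustrative, and minimization of the objective is assumed.

```python
import numpy as np

def greedy_replace(pop, fit, new_sols, idx, f):
    """Greedy update: a GWO-modified solution replaces its parent in the
    bat population only if its fitness is better (minimization assumed).

    pop      : (Npop, d) bat population
    fit      : (Npop,) stored fitness values of the population
    new_sols : (m, d) solutions produced by the GWO operators
    idx      : (m,) parent indices of those solutions in pop
    f        : objective function mapping a d-vector to a scalar
    """
    for j in range(len(idx)):
        fj = f(new_sols[j])
        if fj < fit[idx[j]]:            # better than its parent?
            pop[idx[j]] = new_sols[j]   # replace the solution
            fit[idx[j]] = fj            # and its stored fitness
    return pop, fit
```

Because a candidate is accepted only when it improves on its parent, the population's fitness can never get worse through this step.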
Figure 4: pseudo-code of the proposed method
According to the pseudo-code of the proposed method and the explanations given, the flowchart of the proposed method can be shown in Figure (5).
Figure 5: Flowchart of the Proposed Method
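To show how the pieces of the flowchart in Figure (5) fit together, here is a minimal end-to-end sketch of the hybrid scheme. The population size, iteration count, frequency bounds, and search range [-5, 5]^d are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

def hybrid_bat_gwo(f, d, npop=20, iters=200, seed=0):
    """Sketch of the hybrid loop: each iteration, a random half of the bat
    population gets a GWO-style move toward the three best solutions, kept
    only when it improves; then the bat frequency/velocity update runs.
    All parameter values are illustrative (minimization in [-5, 5]^d).
    """
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (npop, d))
    V = np.zeros((npop, d))
    fit = np.array([f(x) for x in X])
    for t in range(iters):
        a = 2.0 * (1 - t / iters)              # GWO coefficient: 2 -> 0
        order = np.argsort(fit)
        leaders = X[order[:3]].copy()          # alpha, beta, delta
        idx = rng.choice(npop, npop // 2, replace=False)
        for i in idx:                          # GWO move on half the bats
            cand = np.zeros(d)
            for lead in leaders:
                A = 2 * a * rng.random(d) - a
                C = 2 * rng.random(d)
                cand += lead - A * np.abs(C * lead - X[i])
            cand = np.clip(cand / 3.0, -5, 5)
            fc = f(cand)
            if fc < fit[i]:                    # greedy acceptance
                X[i], fit[i] = cand, fc
        x_star = X[np.argmin(fit)]             # current global best
        beta = rng.random((npop, 1))
        V += (X - x_star) * (2.0 * beta)       # bat update, f in [0, 2]
        X_try = np.clip(X + V, -5, 5)
        f_try = np.array([f(x) for x in X_try])
        better = f_try < fit                   # keep only improving moves
        X[better], fit[better] = X_try[better], f_try[better]
    best = int(np.argmin(fit))
    return X[best], float(fit[best])
```

On a simple sphere function this loop steadily contracts toward the origin, since both the GWO move and the bat move are accepted greedily.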
As can be seen in Figure (5), the proposed method consists of the bat optimization algorithm and GWO. In the proposed method, the exploration and exploitation processes of GWO are applied to some of the solutions produced by the bat algorithm. As a result, part of the population of the bat algorithm is changed by the two exploration and exploitation operators of GWO, and this new population enters the bat algorithm population when its results are better than those of the exploration and exploitation operators of the bat algorithm. In this way, the results of the exploration and exploitation processes of GWO are used only when they are better than those of the bat algorithm, which ensures that the modification leads to improvement.
V. RESULT AND DISCUSSION
In this section, the proposed method is evaluated against other comparative algorithms: the harmony search (HS) algorithm, the firefly algorithm, the bat algorithm, and GWO. Standard mathematical functions, called benchmark functions, are used to evaluate and investigate optimization and metaheuristic algorithms. Here, 20 mathematical benchmark functions are used, summarized in detail in Table (1). The performance and efficiency of the proposed method and the other comparative algorithms were then tested and evaluated on these standard functions, and the results were recorded and displayed in separate graphs.
Table 1: Standard benchmark functions to evaluate the proposed method
In the following, the proposed method is implemented on 2D, 4D, 10D, and 20D benchmark functions; the different dimensions are used to show how the algorithm performs as dimensionality changes. In addition, the minimum value found by the proposed method and the other algorithms is tracked over generations, and the optimization progress is shown graphically for comparison. The proposed method and the other algorithms are also compared in terms of the best and worst values of the objective function, the mean objective value of the whole population, and other statistical parameters. The results of implementing the proposed method and the other basic algorithms on eight 2D benchmark functions are shown in figures (6) and (7).
Figure 6: Comparison of the proposed method with other metaheuristic algorithms implemented on functions 1-4 with two dimensions
Figure 7: Comparison of the proposed method with other metaheuristic algorithms implemented on functions 5-8 with two dimensions
The comparison of the proposed method with the other metaheuristic algorithms on 2D functions, shown in figures (6) and (7), indicates that the proposed method performs very well on 2D optimization functions and outperforms the others. The other metaheuristic algorithms also perform well on 2D functions; however, their performance decreases as the dimensions increase. The results of implementing the proposed method and the other basic algorithms on three 4D benchmark functions are shown in Figure (8).
Figure 8: Comparison of the proposed method with other metaheuristic algorithms implemented on functions 9-11 with four dimensions
According to the comparison of the proposed method with the other metaheuristic algorithms on the 4D functions in Figure (8), the proposed method shows relatively stronger performance than the other algorithms on 4D optimization functions. It can also be seen that some algorithms, such as the bat algorithm, lose performance as the dimensions increase. The results of implementing the proposed method and the other basic algorithms on six 10D benchmark functions are shown in figures (9) and (10).
Figure 9: Comparison of the proposed method with other metaheuristic algorithms implemented on functions 12-15 with ten dimensions
Figure 10: Comparison of the proposed method with other metaheuristic algorithms implemented on functions 16-17 with ten dimensions
According to the comparison of the proposed method with the other metaheuristic algorithms on 10D functions in figures (9) and (10), the proposed method performs more strongly than the other metaheuristic algorithms on 10D optimization functions. These results show that the performance of the proposed method does not degrade as the dimensions increase, whereas the performance of some algorithms, such as the HS algorithm, decreases with increasing dimensions. The results of implementing the proposed method and the other basic algorithms on three 20D benchmark functions are shown in Figure (11).
Figure 11: Comparison of the proposed method with other metaheuristic algorithms implemented on functions 18-20 with 20 dimensions
Table 3: Calculation of Statistical Criteria
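The statistical criteria reported in Table (3) are summaries over independent runs of each algorithm on each benchmark. A minimal sketch of how they can be computed (the function name `run_statistics` is an assumption for illustration):

```python
import numpy as np

def run_statistics(best_per_run):
    """Summarize independent runs of an algorithm on one benchmark.

    best_per_run: the best objective value found in each independent run.
    """
    a = np.asarray(best_per_run, dtype=float)
    return {
        "best": a.min(),       # best value over all runs
        "worst": a.max(),      # worst run
        "mean": a.mean(),      # average over runs
        "std": a.std(ddof=1),  # sample standard deviation across runs
    }
```

A small standard deviation across runs indicates that the algorithm reaches similar solutions consistently, which is the property the comparison below relies on.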
Based on the results of implementing the proposed method, the HS algorithm [9], the firefly algorithm [18], the bat algorithm [19], and GWO [20] on low-, medium-, and high-dimensional functions, shown in Table (3), the proposed method yields good results in terms of statistical criteria such as the worst value of the objective function, the mean, and the standard deviation. These criteria also show how well the proposed method changes all the solutions in the population and makes them converge towards the objective. The standard deviation of the proposed method shows a good value for most functions, indicating that, through the hybridization of the bat algorithm and GWO operators, the proposed method was able to influence the whole population. In general, the better these statistical criteria are, the more the proposed method improves the solutions in the population, and the better its performance and efficiency.
VI. Conclusion and Future Work
In this paper, we improved the bat algorithm, which has poor exploration and exploitation, using GWO, which has robust exploration and exploitation processes. In the proposed method, the exploration and exploitation processes of GWO are applied to some of the solutions produced by the bat algorithm. As a result, part of the bat algorithm population is changed by the two exploration and exploitation operators of GWO, and this new population enters the bat algorithm population only when its results are better than those of the bat algorithm's own exploration and exploitation operators. In this way, the results of the GWO exploration and exploitation processes are used only when they outperform the bat algorithm, ensuring that the modification leads to improvement. The proposed method, the HS algorithm, the firefly algorithm, the bat algorithm, and GWO were evaluated on benchmark functions of different dimensions.
The results showed that the proposed method outperformed the other algorithms and exhibited better convergence on most benchmark functions. In addition, the proposed method provided good results in terms of statistical criteria such as the worst value of the objective function, the mean, and the standard deviation. These criteria also showed how the proposed method changes all the solutions in the population and makes them converge towards the objective. The proposed method had a good standard deviation value for most functions, indicating that, through the hybridization of the bat algorithm and GWO operators, it was able to influence the whole population.
References
1. Abedi, M. and F.S. Gharehchopogh, An improved opposition based learning firefly algorithm with dragonfly algorithm for solving continuous optimization problems. Intelligent Data Analysis, 2020. 24(2): p. 309-338.
2. Gharehchopogh, F.S. and H. Gholizadeh, A comprehensive survey: Whale Optimization Algorithm and its applications. Swarm and Evolutionary Computation, 2019. 48: p. 1-24.
3. Shayanfar, H. and F.S. Gharehchopogh, Farmland fertility: A new metaheuristic algorithm for solving continuous optimization problems. Applied Soft Computing, 2018. 71: p. 728-746.
4. Amjad, S.S.G., Farhad, A Novel Hybrid Approach for Email Spam Detection based on Scatter Search Algorithm and K-Nearest Neighbors. Journal of Advances in Computer Engineering and Technology, 2019. 5(3): p. 169-178.
5. Khanalni, S. and F.S. Gharehchopogh, A New Approach for Text Documents Classification with Invasive Weed Optimization and Naive Bayes Classifier. Journal of Advances in Computer Engineering and Technology, 2018. 4(3): p. 31-40.
6. Allahverdipour, A. and F. Soleimanian Gharehchopogh, An Improved K-Nearest Neighbor with Crow Search Algorithm for Feature Selection in Text Documents Classification. Journal of Advances in Computer Research, 2018. 9(2): p. 37-48.
7. Majidpour, H. and F. Soleimanian Gharehchopogh, An Improved Flower Pollination Algorithm with AdaBoost Algorithm for Feature Selection in Text Documents Classification. Journal of Advances in Computer Research, 2018. 9(1): p. 29-40.
8. Gharehchopogh, F.S., S.R. Khaze, and I. Maleki, A new approach in bloggers classification with hybrid of k-nearest neighbor and artificial neural network algorithms. Indian Journal of Science and Technology, 2015. 8(3): p. 237.
9. Geem, Z.W., J.H. Kim, and G.V. Loganathan, A new heuristic optimization algorithm: harmony search. Simulation, 2001. 76(2): p. 60-68.
10. Rabani, H. and F. Soleimanian Gharehchopogh, An Optimized Firefly Algorithm based on Cellular Learning Automata for Community Detection in Social Networks. Journal of Advances in Computer Research, 2019. 10(3): p. 13-30.
11. Gharehchopogh, F.S., H. Shayanfar, and H. Gholizadeh, A comprehensive survey on symbiotic organisms search algorithms. Artificial Intelligence Review, 2019: p. 1-48.
12. Osmani, A., J.B. Mohasefi, and F.S. Gharehchopogh, Sentiment Classification Using Two Effective Optimization Methods Derived From The Artificial Bee Colony Optimization And Imperialist Competitive Algorithm. The Computer Journal, 2020.
13. Hasanluo, M. and F. Soleimanian Gharehchopogh, Software cost estimation by a new hybrid model of particle swarm optimization and k-nearest neighbor algorithms. Journal of Electrical and Computer Engineering Innovations, 2016. 4(1): p. 49-55.
14. Holland, J.H., Genetic algorithms. Scientific American, 1992. 267(1): p. 66-73.
15. Kennedy, J., Particle swarm optimization, in Encyclopedia of Machine Learning. 2011, Springer. p. 760-766.
16. Storn, R. and K. Price, Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 1997. 11(4): p. 341-359.
17. Karaboga, D., An idea based on honey bee swarm for numerical optimization. 2005, Technical report-tr06, Erciyes University, Engineering Faculty, Computer Engineering Department.
18. Yang, X.-S., Firefly algorithm. Nature-Inspired Metaheuristic Algorithms, 2008. 20: p. 79-90.
19. Yang, X.-S., A new metaheuristic bat-inspired algorithm, in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010). 2010, Springer. p. 65-74.
20. Mirjalili, S., S.M. Mirjalili, and A. Lewis, Grey wolf optimizer. Advances in Engineering Software, 2014. 69: p. 46-61.
21. Zhang, J.W. and G.G. Wang, Image matching using a bat algorithm with mutation, in Applied Mechanics and Materials. 2012, Trans Tech Publ.
22. Saha, S.K., et al., A new design method using opposition-based BAT algorithm for IIR system identification problem. International Journal of Bio-Inspired Computation, 2013. 5(2): p. 99-132.
23. Yılmaz, S., E.U. Kucuksille, and Y. Cengiz, Modified bat algorithm. Elektronika ir Elektrotechnika, 2014. 20(2): p. 71-78.
24. Li, L. and Y. Zhou, A novel complex-valued bat algorithm. Neural Computing and Applications, 2014. 25(6): p. 1369-1381.
25. Cai, X., et al., Bat algorithm with Gaussian walk. International Journal of Bio-Inspired Computation, 2014. 6(3): p. 166-174.
26. Dao, T.-K., et al., Compact bat algorithm, in Intelligent Data Analysis and its Applications, Volume II. 2014, Springer. p. 57-68.
27. Fister, I., S. Fong, and J. Brest, A novel hybrid self-adaptive bat algorithm. The Scientific World Journal, 2014.
28. Jun, L., L. Liheng, and W. Xianyi, A double-subpopulation variant of the bat algorithm. Applied Mathematics and Computation, 2015. 263: p. 361-377.
29. Wang, G.-G., H.E. Chu, and S. Mirjalili, Three-dimensional path planning for UCAV using an improved bat algorithm. Aerospace Science and Technology, 2016. 49: p. 231-238.
30. Liu, Q., et al., A novel hybrid bat algorithm for solving continuous optimization problems. Applied Soft Computing, 2018. 73: p. 67-82.
31. Chakri, A., et al., New directional bat algorithm for continuous optimization problems. Expert Systems with Applications, 2017. 69: p. 159-175.
32. Purkait, G., et al., An Improved Bio-inspired BAT Algorithm for Optimization, in Progress in Advanced Computing and Intelligent Engineering. 2019, Springer. p. 241-248.
33. Asghari Agcheh Dizaj, S. and F. Soleimanian Gharehchopogh, A New Approach in Software Cost Estimation by Improving Genetic Algorithm with Bat Algorithm. Journal of Computer & Robotics, 2018: p. 31-44.