Circulatory System Based Optimization (CSBO): an expert multilevel biologically inspired meta-heuristic algorithm

ABSTRACT Optimization problems are becoming more complicated, requiring new and efficient optimization techniques to solve them. Many bio-inspired meta-heuristic algorithms have emerged in the last decade to solve these complex problems; however, most of them may become trapped in local optima and cannot effectively solve all types of optimization problems. Hence, researchers are still trying to develop new and better optimization algorithms. This paper introduces a novel biologically based optimization algorithm called circulatory system-based optimization (CSBO). CSBO is modeled on the function of the body's blood vessels with two distinctive circuits, i.e. the pulmonary and systemic circuits. The proposed CSBO algorithm is tested on a wide variety of complex real-world functions and validated against standard meta-heuristic algorithms. The results indicate that the CSBO algorithm successfully achieves optimal solutions and avoids local optima. Note that the source code of the CSBO algorithm is publicly available at http://www.optim-app.com/projects/csbo.


Introduction
Optimizing the design process is one of the most critical topics that engineers and inventors consider. A typical design can be optimized if its parameters are selected appropriately. Optimization is a mathematical tool that selects the best decision from the available set of possible solutions to achieve an ideal goal. After the problem variables are defined, a function of them, called the objective function, is formulated. Physical conditions are then expressed as constraints on the problem. Finally, the optimal solution is obtained by solving the resulting model using optimization methods (Radosavljević, 2018).
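The general formulation referred to in the next paragraph is omitted from this excerpt; reconstructed from the definitions that follow (objective F(x), D-dimensional variable vector x, n inequality and m equality constraints), it reads:

```latex
\begin{aligned}
\min_{x}\quad & F(x), \qquad x = (x_1, x_2, \dots, x_D) \\
\text{s.t.}\quad & g_i(x) \le 0, \qquad i = 1, 2, \dots, n \\
& h_j(x) = 0, \qquad j = 1, 2, \dots, m
\end{aligned}
```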
In the above formulation, F(x) represents the objective function of a typical optimization problem, and x is the D-dimensional vector of variables, i.e. the algorithm inputs. The vectors g_i(x) and h_j(x) are the inequality and equality constraints of the problem, and the parameters n and m are the numbers of inequality and equality constraints, respectively.
Optimization problems can be divided, from different application perspectives, into the following categories:
• Static and dynamic: If the objective function varies with time, the optimization problem is dynamic; otherwise, it is static.
• Constrained and unconstrained: A constrained optimization problem involves variables restricted to a specific set or constraint, whereas an unconstrained optimization problem concerns variables that are not restricted.
• Linear or non-linear: The objective function and constraints of linear optimization problems are linear functions of the design variables.
• Discrete and continuous: A discrete optimization problem has certain discrete control variables. In a continuous problem, by contrast, the values of the variables are continuous.
• Random and non-random: Depending on whether the variables take real, binary, or random values, the problem is divided into real, binary, or random categories.
• Single and multi-objective: A problem can have multiple objective functions, each of which requires the setting of specific control parameters.
Methods for optimization problems are divided into four categories: counting, computational, innovative, and meta-heuristic methods. As in dynamic programming, counting methods expand the scope of the search in each iteration because they visit only one point of the objective function's domain at a time. Computational methods lie at the interface of applied mathematics, statistics, information science, and engineering, emphasizing analytically intractable problems of classical and quantum computation. Researchers are interested in developing computational and mathematical methods to quantify, analyze, and understand increasingly complicated systems. Unlike mathematical methods, whose convergence can be proven (where the problem is convex), the innovative and meta-heuristic methods may not have proper convergence guarantees. When the objective function is neither convex nor concave, there is almost no mathematical algorithm that guarantees the globally optimal solution. Therefore, different optimization methods and new stochastic algorithms have been established. These methods obtain a close and acceptable optimal solution in a limited and acceptable time; they are stochastic and inspired by nature and physical processes. Due to the drawbacks of the mathematical methods, evolutionary methods have been proposed for optimization problems, and the algorithms introduced below are among them. Last but certainly not least, developing novel algorithms remains a prominent area of research for many researchers. Evolutionary events, the collective behavior of species (swarm intelligence approaches), physical laws, and human-related concepts can all serve as inspiration for developing a new algorithm.
Indeed, the inspired meta-heuristic algorithms are classified into four subclasses as a primary classification (Mirjalili, 2016): • Evolutionary algorithms (EAs), inspired by natural evolutionary processes or occurrences: examples include the artificial flora algorithm (AFA), which uses the behavior of flora (L. Cheng et al., 2018); plant intelligence (Akyol & Alatas, 2017); the virulence optimization algorithm (VOA), an optimization strategy inspired by the process through which viruses attack bodily cells (Jaderyan & Khotanlou, 2016); and evolution strategies (ESs), a subclass of nature-inspired direct search (Rechenberg, 1989). Genetic algorithms (GAs) are search heuristics inspired by Charles Darwin's theory of natural evolution (Holland, 1992). The unique feature of the opposition-based high dimensional optimization algorithm (OHDA) is its angular movement in response to very few samples, enabling successful search in high dimensions (GhaemiDizaji et al., 2020). An artificial infectious disease algorithm via the SEIQR epidemic model (G. Huang, 2016) aims to show the relationship of an infectious disease to an optimization algorithm. The mouth brooding fish algorithm (Jahani & Chizari, 2018) models organisms' symbiotic interaction tactics to live and reproduce in an environment and finds the optimal answer by using mouth brooding fish movement, dispersion, and protection patterns.
Further examples are colonial competitive differential evolution (CCDE), based on mathematical modeling of sociopolitical evolution (Ghasemi et al., 2016); the tree growth algorithm (TGA), which replicates the fight for food and light among trees (Cheraghalipour et al., 2018); invasive tumor growth optimization (ITGA), modeled on abnormal cells detaching from the tumor bulk because of a decrease in or complete lack of intercellular adhesion molecules (Tang et al., 2015); the slime mould algorithm (SMA), inspired by the foraging and diffusion conduct of slime mould; and invasive weed optimization (IWO), which mimics the behavior of a weed colony (Mehrabian & Lucas, 2006).

• Swarm Intelligence Algorithms (SIAs) development:
Simulating natural patterns and behaviors is one of the primary goals of SIAs. Examples include the fitness dependent optimizer (FDO), which simulates the behavior of a bee swarm in order to locate better colonies (Abdullah & Ahmed, 2019); the lion optimization algorithm (LOA), inspired by lions' unique lifestyle and social behavior (Yazdani & Jolai, 2016); monarch butterfly optimization (MBO), inspired by an idealized and simplified model of the travel of monarch butterflies; the yellow saddle goatfish algorithm, an optimization model motivated by yellow saddle goatfish hunting behavior involving chaser and blocker fish (Zaldívar et al., 2018); the Aquila optimizer (AO), inspired by the Aquila's strategies in nature during the process of catching prey; the sailfish optimizer (SO), influenced by a group of hunting sailfish (Shadravan et al., 2019); the moth search algorithm (MSA), motivated by the Lévy flights and phototaxis of moths (Wang, 2018); ant colony optimization (ACO) (Dorigo & di Caro, 1999); the kidney-inspired algorithm (KA), a novel population-based algorithm informed by the human kidney mechanism (Jaddi et al., 2017); the artificial hummingbird algorithm (AHA), which mimics the intelligent foraging behaviors and flight skills of hummingbirds (Zhao et al., 2022); Harris hawks optimization (HHO), a naturalistic approach (Heidari et al., 2019); artificial ecosystem-based optimization (AEO), a population-based optimizer that mimics three unique traits of living organisms (Zhao et al., 2020); the chameleon swarm algorithm (CSA), which mimics the dynamic skills of chameleons when hunting and navigating for food sources (Braik, 2021); the crow search algorithm (CSA), motivated by crows' socially smart tendency to hide food (Askarzadeh, 2016); the cooperation search algorithm (CSA), taking cues from modern business teamwork (Feng et al., 2021); the grasshopper optimization algorithm (GOA), based on the natural foraging and swarming activity of grasshoppers; and the COOT algorithm, motivated by the
dynamic behavior of a population of birds (Naruei & Keynia, 2021). Others are the black widow optimization algorithm (BWOA), prompted by black widow spider mating rituals (Hayyolalam & Kazem, 2020); the JAYA algorithm, a gradient-free optimization technique (R. Rao, 2016); the tunicate swarm algorithm (TSA), which mimics tunicates' jet propulsion, resembling turbojet engines, and their swarming behavior during navigation and foraging (Kaur et al., 2020); the chimp optimization algorithm (COA), influenced by chimps' individuality and sexual motivation (Khishe & Mosavi, 2020); the pity beetle algorithm (PBA), based on a beetle's aggregation habit (Kallioras et al., 2018); the emperor penguin optimizer (EPO), which resembles emperor penguin huddling (Dhiman & Kumar, 2018); ludo game-based meta-heuristics (P. R. Singh et al., 2019), which use two or four players imitating the game of ludo to update distinct swarm-intelligence characteristics; the red fox optimization algorithm (RFO), which uses a mathematical model of red fox hunting, population development, food searching, and habits (Połap & Woźniak, 2021); galactic swarm optimization (GSO), via motion between stars (Muthiah-Nakarajan & Noel, 2016); the parasitism-predation algorithm (PPA), which tackles the challenges of low convergence and the dimensionality constraints of enormous data (A.-A. A. Mohamed et al., 2020); the earthworm optimization algorithm (EWA), via the butterfly adjusting operator and migration operator; barnacle mating optimization (BMO), for which barnacle mating habits in nature served as inspiration (Sulaiman et al., 2020); the squirrel search algorithm (SSA), a biologically inspired optimization algorithm (Jain et al., 2019); the colony predation algorithm (CPA), based on how animals avoid enemies (Tu et al., 2021); the wild geese algorithm (WGA), based on natural life and death in the wild (Ghasemi et al., 2021); hunger games search (HGS), which mimics the behavioral choices and hunger-driven activities of animals; the bald eagle search optimization algorithm (BESO), based on bald eagles' hunting tactics and social conduct when searching for fish (Alsattar et al., 2020); phasor particle swarm optimization (PPSO), based on a phasor-theoretic model of particle design variables with a phase angle (Ghasemi et al., 2019); elephant herding optimization (EHO), which mimics the herding behavior of elephants (Wang et al., 2015); and the butterfly optimization algorithm (BOA), which imitates the natural foraging and mating activities of butterflies (Sharma et al., 2021). • Physics-Inspired Algorithms (PIAs): the Yin-Yang-pair optimization (YYO) algorithm, based on a physical event or a specific tool (Punnathanam & Kotecha, 2016); an algorithm based on Franklin's and Coulomb's laws, i.e.
the CFA optimizer (Ghasemi et al., 2018); the gradient-based optimizer (GBO), which uses Newton's method to explore the search domain using a number of vectors and two major operators (Ahmadianfar et al., 2020); electromagnetic field optimization (EFO), based on the behavior of electromagnets with varying polarities and a natural ratio called the golden ratio (Abedinpourshotorban et al., 2016); the weIghted meaN oF vectOrs algorithm (INFO), which uses a solid structure for updating the vectors' positions (Ahmadianfar et al., 2022); the wind driven optimization (WDO) algorithm, which updates the velocity and position of wind-controlled air parcels according to the physical equations governing air motion (Bayraktar et al., 2013); Lévy flight distribution (LFD) (Houssein et al., 2020); the equilibrium optimizer (EO), an optimization technique based on control-volume mass-balance models (Faramarzi et al., 2020); simulated annealing (SA), a method based on the metalworking process of heating and cooling a material to change its physical qualities (Kirkpatrick et al., 1983); the supernova optimizer (SO), motivated by the supernova phenomenon (Hudaib & Fakhouri, 2018); dynamic differential annealed optimization (DDAO) (Ghafil & Jármai, 2020); henry gas solubility optimization (HGSO), encouraged by Henry's law (Hashim et al., 2019); the artificial chemical reaction optimizer (ACRO), designed to be inspired by chemical reactions (Alatas, 2011); water evaporation optimization (WEO), which simulates the evaporation of water molecules on a solid surface with varying wettability (Kaveh & Bakhshpoori, 2016); rain-fall optimization, based on the behavior of raindrops (Kaboli et al., 2017); gases Brownian motion optimization (GBMO), motivated by gas Brownian movement and turbulent rotational motion (Abdechiri et al., 2013); atom search optimization (ASO), based on the basics of molecular dynamics (Zhao et al., 2019); and turbulent flow of water-based optimization (TFWO), inspired by
whirlpools formed in turbulent water flow (Ghasemi et al., 2020); thermal exchange optimization (TEO), based on Newton's law of cooling (Kaveh & Dadras, 2017); heat transfer search (HTS), based on the laws of thermodynamics and heat transfer (Patel & Savsani, 2015); the RUNge Kutta algorithm (RUN), based on the mathematical foundations of the Runge-Kutta (RK) method (Ahmadianfar et al., 2021); weighted superposition attraction (WSA), in which agents create a superposition that causes other solution vectors to follow (Baykasoğlu & Akpinar, 2017); and the gravitational search algorithm (GSA), based on mass exchanges and gravity (Rashedi et al., 2009). • Human/social-related Algorithms (HSAs): the political optimizer (PO), replicating the human political process (Askari et al., 2020); a very optimistic method (Vommi & Vemula, 2018), employing two factors, luck and effort; the future search algorithm (FSA), which mimics a person's life (Elsisi, 2019); the volleyball premier league algorithm (VPLA), which works through interaction and competition among volleyball teams (Moghdani & Salimifard, 2018); the path planning algorithm (PPA), which finds a sequence of valid configurations (Zhou et al., 2017); the pathfinder algorithm (PA), which tries to solve graph theory's shortest-path problem (Yapici & Cetinkaya, 2019); the teaching-learning-based optimization (TLBO) algorithm, which examines a teacher's impact on students (R. V. Rao et al., 2011); the imperialist competitive algorithm (ICA), an optimization method influenced by imperialism (Atashpaz-Gargari & Lucas, 2007); collective decision optimization (CDO), based on the decision-making traits of human social behavior (Q. Zhang et al., 2017); and the queuing search algorithm (QSA), which is stimulated by human behavior in the queuing process (J. Zhang et al., 2018).
As mentioned above, many optimization algorithms have been proposed recently. As stated by the no free lunch theorems (Wolpert & Macready, 1997), the real world presents a wide range of complex problems with different objective functions. Therefore, by its inherent nature, an algorithm may perform best on several functions yet perform poorly on several other problems. No single algorithm can effectively solve the full variety of real-world problems, which is the fundamental reason for presenting these new algorithms. For example, the problems addressed in the CEC 2014 suite consist of 30 different test functions: unimodal, simple multimodal, hybrid, and composition functions. In CEC 2014, these 30 test functions were designed to cover a wide range of real-world problems, such as engineering design and economic load dispatch in small- to large-scale systems. Therefore, researchers need an algorithm that can cover a broad range of problems and rank excellently across all 30 test functions. On the other hand, time is a significant factor in many optimization problems; hence a new algorithm is required that is powerful and robust and reaches an acceptable optimal solution at a reasonable speed. Besides, each optimization algorithm has several control parameters whose values the user must determine according to their experience, which is sometimes time-consuming and questionable. Therefore, a simple and robust algorithm with fewer control parameters is required to give optimal and reasonable results. In other words, the need for a comprehensive and robust algorithm is felt in all branches of science. This article introduces an algorithm, called CSBO, that can achieve these goals.
The CEC 2005, CEC 2014, and CEC 2017 standard test functions are utilized in this paper to demonstrate the effectiveness of the CSBO. These standard test functions cover a wide range of function types, such as unimodal, multimodal, and hybrid functions. We compare the results with several modern and standard algorithms at each optimization stage to show the algorithm's performance.
Briefly, the advantages of the proposed algorithm can be listed as follows:
• A new meta-heuristic algorithm inspired by regular body function
• The ability to perform effectively on a wide range of real-world functions
• Competitive performance compared to modern and standard algorithms
The remainder of this article is organized as follows: the conceptual and mathematical formulations of the CSBO algorithm are introduced in Section 2. In Section 3, the performance of the proposed algorithm is benchmarked on real-world test functions. The application of the proposed algorithm to solving some engineering optimization problems with complicated search spaces is investigated in Section 4. Finally, Section 5 presents some essential conclusions.

Circulatory System Based Optimization (CSBO) algorithm
Modeling is one of the most important branches of engineering and a good way to study the behavior of a system (Ghasemi et al., 2021). Models are representations of different systems. With the help of a model, the effect of different factors on the system can be simulated. Of course, models must predict the behavior of different systems and functions of each problem under different conditions.

Regular circulatory system
This article presents a new powerful optimizer using a model of the circulatory system's function. The heart is a fantastic organ that pumps oxygen- and nutrient-rich blood through the human body to sustain life. The heart is an essential part of the cardiovascular system, also called the circulatory system, which contains all the elastic, muscular tubes (vessels) that carry blood from the heart to the body and back to it. Blood is vital for the body. In addition to carrying fresh oxygen from the lungs and nutrients to the body's tissues, it also carries the body's waste products, including carbon dioxide, away from the tissues. This circulation is necessary to sustain life and promote the health of all parts of the body.
According to the simple inspiration model of the regular circulatory system shown in Figure 1, the body's blood vessels are functionally divided into two distinctive circuits: the pulmonary circuit and the systemic circuit. The pump for the pulmonary circuit, which circulates blood through the lungs, is the right ventricle. The left ventricle is the pump for the systemic circuit, which provides the blood supply for the body's tissue cells. Pulmonary circulation transports oxygen-poor blood from the right ventricle to the lungs, where the blood picks up a fresh supply of oxygen. It then returns the oxygen-rich blood to the left atrium.
Blood is considered a Newtonian fluid in most cases. The main variables of the circulatory system are flow, pressure, and volume. Pressure-flow modeling of the circulatory system can be examined from two perspectives, beating and non-beating; the model considered here is inspired by the beating perspective.
Arteries and veins are considered cylindrical vessels whose walls have elastic properties. To simulate blood flow, most models use the momentum-continuity equations, known as the Navier-Stokes equations (Johnston et al., 2006), under the assumption of constant density and viscosity. Neglecting gravitational terms, the Newtonian fluid can be expressed in the following general form.

Figure 1. A simple inspiration model from the circulatory system for modeling CSBO ('Pixabay.com,' n.d.).
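The general form itself is omitted from this excerpt; reconstructed from the symbols defined just below, and standard for a constant-density, constant-viscosity Newtonian fluid without gravitational terms, it reads:

```latex
\rho\left(\frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v}\cdot\nabla)\mathbf{v}\right) = -\nabla P + \nabla\cdot\boldsymbol{\tau}
```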
where ρ is the fluid density, v is the velocity vector, P is the pressure, t represents time, and τ is the stress tensor.
The systemic circulation provides the functional blood supply to all body tissue. It carries oxygen and nutrients to the cells and picks up carbon dioxide and waste products. Systemic circulation carries oxygenated blood from the left ventricle, through the arteries, to the capillaries in the body's tissues. From the tissue capillaries, the deoxygenated blood returns through a system of veins to the right atrium of the heart. It then moves into the right ventricle, and the above cycle is repeated, equivalent to one iteration in our proposed algorithm.
In this algorithm, we model the pulmonary and systemic circuits as two separate groups with two different optimization cycles; each circuit is equivalent to a specific update function applied to a specific part of the population. Here, this process of the circulatory system is equivalent to the generation of a stronger population and the elimination of a weaker population in the optimization algorithm. The mathematical modeling of the CSBO optimization process is explained in the following sections.

Circulatory system regular performance as an intelligent systematic algorithm: CSBO
In the mathematical modeling of new meta-heuristic optimization algorithms, many hypotheses based on the inspiration of the phenomenon may be considered. In this section, we briefly explain how to model the circulatory system function as an optimizer and implement the proposed CSBO algorithm.
In the CSBO algorithm, like any other meta-heuristic optimization algorithm, an initial population is first generated by a random function within the problem range; here, this population represents a mass of blood droplets. The positions of the blood droplets represent the possible solutions to an optimization problem in the search space, and the circulatory system acts as an operator on this population to refine and strengthen it and to eliminate the weaker members. In other words, the solution (blood) quality in the search space (body) is improved through an iterative process based on the functionality of the blood's circulatory system in the body.
In the proposed algorithm, the pulmonary circulation deals with deoxygenated blood, which is equivalent to the weaker population, and the systemic circulation deals with oxygenated blood, which is equivalent to the population with better objective values; in other words, it deals with the better population. The ith blood mass BM_i (the ith individual of the population in CSBO) moves based on its position: it is directed to a more optimal position if one is found; otherwise, it maintains its current position. Figure 2 shows how the evolutionary process of the blood in the circulatory system can equivalently be modeled as an optimizer. Also, Table 1 shows, in detail, how the elements and functions of the circulatory system are modeled in the proposed CSBO algorithm.

The mathematical modeling of the CSBO algorithm
At first, the CSBO algorithm, like any other meta-heuristic algorithm, starts with an initial population of blood masses BM_i = (Bm_i,1, Bm_i,2, ..., Bm_i,D) for a typical problem with D dimensions (d = 1:D). Each blood mass is randomly generated between the minimum BM_min = (Bm_min,1, Bm_min,2, ..., Bm_min,D) and maximum BM_max = (Bm_max,1, Bm_max,2, ..., Bm_max,D) values of the problem parameters' range as follows: This initial population, as mentioned earlier, plays the same role as blood particles or masses in the body.
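The initialization step described above can be sketched as follows. This is a minimal illustration of uniform random initialization between per-dimension bounds; the function and parameter names (`initialize_population`, `n_pop`) are ours, not from the paper.

```python
import numpy as np

def initialize_population(n_pop, bm_min, bm_max, rng=None):
    """Randomly generate n_pop blood masses within [bm_min, bm_max] per dimension."""
    rng = np.random.default_rng() if rng is None else rng
    bm_min = np.asarray(bm_min, dtype=float)
    bm_max = np.asarray(bm_max, dtype=float)
    # Uniform sampling between the per-dimension bounds, as described in the text.
    return bm_min + rng.random((n_pop, bm_min.size)) * (bm_max - bm_min)
```

Each row of the returned array is one blood mass BM_i, ready to be evaluated by the objective function.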

Movement of blood mass in the veins
The ith blood mass in the veins, BM_i, moves based on the imposed force or pressure. The mass always moves in a direction with more favorable conditions; therefore, the value of its objective function (the amount of force or pressure) decreases. We can model clogged arteries in the heart as trapping in locally optimal solutions; as in the real world, this situation should be avoided. As long as the body continues to work, the algorithm continues its optimization process. This step of the circulatory cycle is modeled based on the particle positions and their objective function values as follows: In fact, K_ij determines the direction of movement of the ith blood mass (BM_i) in the arteries. p_i is a value between 0 and 1 and depends on the problem dimensions; it determines the amount of displacement and the movement toward a better value in each circulation cycle.
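The paper's movement equation is not reproduced in this excerpt, so the following is only a plausible sketch of the behavior described: a direction term K chosen relative to a random partner, a step scaled by p_i, and greedy acceptance. All details beyond what the text states are our assumptions.

```python
import numpy as np

def move_in_veins(pop, fitness, objective, p, rng=None):
    """Illustrative vein-movement step (NOT the paper's exact equation):
    each blood mass drifts relative to a random partner, scaled by p[i],
    and a move is kept only if the objective improves (greedy acceptance)."""
    rng = np.random.default_rng() if rng is None else rng
    n_pop = pop.shape[0]
    new_pop = pop.copy()
    new_fit = fitness.copy()
    for i in range(n_pop):
        j = int(rng.integers(n_pop))          # random partner blood mass
        k = np.sign(fitness[i] - fitness[j])  # +1: partner is fitter, move toward it
        candidate = pop[i] + p[i] * k * (pop[j] - pop[i])
        f = objective(candidate)
        if f < new_fit[i]:                    # keep the move only if it improves
            new_pop[i], new_fit[i] = candidate, f
    return new_pop, new_fit
```

Because of the greedy acceptance rule, no member's objective value can worsen in this step, mirroring the text's statement that a mass otherwise maintains its current position.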

Population or blood mass flow in pulmonary circulation
As mentioned earlier, the pulmonary circulation deals with deoxygenated blood, equivalent to the weaker population in the optimization. In fact, in CSBO, at each iteration, the population is sorted, and the NR weakest members enter the pulmonary circulation and are directed to the lungs to gain oxygen, as modeled in (7). In (7), randn denotes a normally distributed random number, it indicates the current algorithm iteration, randc indicates a random vector drawn from the Cauchy probability distribution, and D is the number of dimensions of the optimization problem. The pulmonary circulation also changes p_i for this population as follows:
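Equation (7) itself is not shown in this excerpt; the sketch below only combines the ingredients the paragraph names (a normally distributed scale `randn`, a Cauchy-distributed vector `randc`, and the iteration counter `it`) in one plausible way. The exact combination is our assumption.

```python
import numpy as np

def pulmonary_step(bm, it, rng=None):
    """Illustrative pulmonary-circulation update (Eq. (7) is not reproduced here):
    perturb a weak blood mass with a Cauchy-distributed vector whose
    magnitude decays as the iteration counter `it` grows."""
    rng = np.random.default_rng() if rng is None else rng
    randc = rng.standard_cauchy(bm.size)  # heavy-tailed jumps help escape local optima
    return bm + (rng.standard_normal() / it) * randc
```

The heavy tails of the Cauchy distribution occasionally produce long jumps, which is a common device for helping weak members escape local optima, while the 1/it factor makes the search increasingly local.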

Population or blood mass flow in systemic circulation
As mentioned, the NR weakest members of the sorted population enter the pulmonary circulation. The rest of the population (NL = Npop - NR), which have better fitness values, enter the systemic circulation with updated values in order to circulate through the body, as modeled below: The systemic circulation also corrects p_i for this group of the population as follows: where F_Worst and F_Best are the worst and best values of the cost function obtained up to the current iteration. The optimization cycle is continued for the specified number of iterations. Similar to other meta-heuristic algorithms, each member of the population accepts a new position only if it obtains a better value of the fitness function.
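The p_i correction formula is not printed in this excerpt. One natural reading of the text, sketched here as an assumption, normalizes each member's cost between F_Best and F_Worst so that better members take smaller steps:

```python
import numpy as np

def update_p(fitness, f_best, f_worst):
    """Illustrative p_i correction (the paper's exact formula is not shown here):
    map each cost onto [0, 1] relative to the best/worst values seen so far,
    so that better members receive smaller, more conservative step sizes."""
    span = max(f_worst - f_best, 1e-12)  # guard against division by zero
    return (np.asarray(fitness, dtype=float) - f_best) / span
```

This keeps every p_i between 0 and 1, consistent with the earlier statement that p_i is a value in that range.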
The CSBO algorithm pseudo-code is summarized in Algorithm 1.
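Since Algorithm 1 itself is not reproduced here, the overall iteration described in the preceding sections can be outlined as follows. This is a structural sketch only: the per-step update rules are placeholders for the paper's equations, and all names and formulas beyond the described structure (sorting, NR weakest to the pulmonary circuit, the rest to the systemic circuit, greedy acceptance) are our assumptions.

```python
import numpy as np

def csbo_sketch(objective, bounds, n_pop=45, n_r=15, max_nfe=300_000, rng=None):
    """Structural outline of CSBO (placeholder update rules, not the paper's exact equations)."""
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = (np.asarray(b, dtype=float) for b in bounds)
    pop = lo + rng.random((n_pop, lo.size)) * (hi - lo)      # initial blood masses
    fit = np.array([objective(x) for x in pop])
    nfe, it = n_pop, 1
    while nfe < max_nfe:
        order = np.argsort(fit)                              # sort by cost (best first)
        weak, strong = order[-n_r:], order[:-n_r]            # NR weakest -> pulmonary circuit
        for i in weak:                                       # pulmonary: Cauchy perturbation
            cand = pop[i] + (rng.standard_normal() / it) * rng.standard_cauchy(lo.size)
            cand = np.clip(cand, lo, hi)
            f = objective(cand); nfe += 1
            if f < fit[i]:                                   # greedy acceptance
                pop[i], fit[i] = cand, f
        best = pop[order[0]].copy()
        for i in strong:                                     # systemic: drift toward the best
            cand = pop[i] + rng.random() * (best - pop[i])
            cand = np.clip(cand, lo, hi)
            f = objective(cand); nfe += 1
            if f < fit[i]:
                pop[i], fit[i] = cand, f
        it += 1
    k = int(np.argmin(fit))
    return pop[k], fit[k]
```

The defaults n_pop = 45 and NR = 15 follow the parameter study reported later in the paper; the termination criterion is a fixed budget of function evaluations, as in the experiments.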

The competitive study based on the standard real-world benchmark functions: CEC 2005, CEC 2014, and CEC 2017 benchmarks
In order to confirm the functionality of our proposed approach, the CSBO algorithm, a test bed of well-known benchmark functions is employed in the following experiments: 14 functions from CEC 2005, 30 from CEC 2014, and 29 from CEC 2017.

CEC2005 benchmark functions
In this section, the first fourteen popular CEC 2005 test functions (Y. Wang et al., 2011) are utilized to show the performance of the proposed algorithm. In these functions, a_ij and b_ij are random integer numbers in the range [−100, 100], and α is a random number from [−π, π]. These functions, which have been used in a wide variety of optimization articles in recent years, are shown in Table 2.

Number of population in CSBO algorithm
In this section, five different population sizes are selected for our algorithm, and the optimization functions are then run. These population sizes are 30, 45, 60, 75 and 90. All simulations were performed on the same computer, with 30 runs per test function, 300,000 function evaluations (NFEs) (Y. Wang et al., 2011), and 30 dimensions. The simulation results with NR = 15 are given in Table 3. In the last column, three indices are shown: Nb (the number of best results) is the number of trials in which the algorithm obtained the best results in comparison with the other studied algorithms; Nw (the number of worst results) is the number of trials in which the algorithm obtained the worst results compared to the other studied algorithms; and Mr (the mean rank) is the average rank over all trials among all the test functions. According to Table 3, the algorithm performs well with different population sizes. For example, a population of 45 is an appropriate choice for D = 30; we chose this number for the rest of our work.

Effect of NR on CSBO performance analysis
In this section, we consider different values of NR, i.e. 5, 10, 15, 20 and 30, to assess the algorithm's effectiveness. All the simulations were done with a population of 45. From the results in Table 4, it is evident that the algorithm performs well for different values of NR, and NR = Npop/3 is an appropriate choice for our algorithm. R denotes the algorithm's ranking on the corresponding function compared to the other algorithms in Table 4, in which 1 and 5 indicate the best and worst rank, respectively; this convention holds throughout this paper.
Figure 3 summarizes the CSBO convergence curves, based on the results given in Table 4, for optimizing the CEC 2005 functions. As is clear from these curves, CSBO has a decent convergence speed for most functions. Although the convergence curve for NR = 5 on F9 has the highest convergence rate, its final solution is inferior to those of the other NR values. On the other hand, NR = 15 has the best convergence characteristic. Moreover, the convergence rates for NR = 10 and 30, and even NR = 5, are also acceptable.

Comparison with other algorithms
This section compares the test-function results obtained by the CSBO algorithm with those of other standard algorithms from related articles, as shown in Table 5, with NFEs of 300,000 for D = 30 and 500,000 for D = 50. The parameter settings of some competitor algorithms, in this case, are given in Table 6. In this table, the plus sign (+) indicates that the other algorithm outperforms the proposed CSBO, the minus sign (−) indicates that the other algorithm underperforms the proposed CSBO, and the equals sign (=) indicates the same functionality.
We compared our algorithm with GL-25 (global and local real-coded genetic algorithms based on parent-centric crossover operators) (Y. Wang et al., 2011), HRCGA (real-coded genetic algorithm) (C. Li et al., 2011), EPSDE (an ensemble of trial vector generation approaches and control parameters of DE) (Y. Wang et al., 2011), SaDE (DE with strategy adaptation) (Y. Wang et al., 2011), SLPSO (self-learning particle swarm optimizer) (C. Li et al., 2011), APSO (an adaptive PSO) (C. Li et al., 2011) and FIPS or FIPSO (a fully informed PSO) (C. Li et al., 2011). Although the proposed algorithm is basic, it can outperform the other algorithms under the same circumstances, which shows its effectiveness as a novel solution for optimization problems. The '−', '+', and '=' denote that the performance of the corresponding algorithm is worse than, better than, and similar to that of CSBO, respectively. From Table 5, it can be seen that SaDE and EPSDE generally have similar performance after CSBO. The FIPS algorithm has the worst performance among all algorithms in this table. In addition, PSO and GA have the same average ranking. Moreover, SaDE has the performance most comparable to CSBO on the test functions F6 and F14. It should be noted that SaDE is an enhanced, evolved algorithm, while CSBO is the first version of its kind.

CEC 2014 benchmark functions
This section investigates the results of implementing the CSBO on CEC 2014 benchmark functions.

CSBO initial evaluation in comparison with original classical algorithms
In order to verify the performance of the proposed CSBO algorithm against other algorithms, we use the CEC2014 benchmark functions in this section. These real-world-modeled functions (unimodal, simple multimodal, hybrid, and composition benchmark tests) have been used successfully in many recent articles and are therefore adopted here to test the proposed CSBO algorithm; they are fully described in (J. J. Liang et al., 2013).
In this section, we test two dimensions, 30 and 50, with 30 runs for each test function. Throughout the article, the number of function evaluations is 300,000 for D = 30 and 500,000 for D = 50. Also, Npop and NR are set to 45 and 15 for D = 30, and to 60 and 20 for D = 50, respectively. The parameter settings of some of the competitor algorithms are given in Table 7.
The simulation results for D = 30 are given in Table 8, compared with the robust, modern algorithms of (X. Chen et al., 2017), e.g. LDWPSO (linearly decreasing inertia weight PSO), FIPSO or FIPS, BLPSO (biogeography-based learning PSO), RCBBOG (real-coded biogeography-based optimization with Gaussian mutation), GL-25, and GBABC (Gaussian bare-bones artificial bee colony). Tables 8 and 9 clearly show that the proposed CSBO algorithm defeats all other algorithms for most functions (F1, F2, F3, F4, F13, F15, F17, F18, F20, F21, F22, F24, F25, F26, F28, and F30). In addition, increasing the dimension of the functions does not significantly affect CSBO. Interestingly, the proposed algorithm never had the worst performance or rank among the algorithms for either dimension, indicating the robustness and reliability of CSBO. Nevertheless, it should be mentioned that the GBABC and BLPSO algorithms overcame CSBO for 7 and 8 test functions of D = 30, respectively, which is normal. Examining the table results, we find that although CSBO obtained relatively poor rankings (rank 4) for the three functions 9, 27, and 29, its mean result differs only slightly from the best mean result. For example, for function 29, the best mean result is 1.02E+03, obtained by the GL-25 genetic algorithm, while the CSBO result is 1.34E+03, which is acceptable. Although CSBO had its worst performance, with a rank of 5, for function 7, it obtained the best solution for 16 test functions. Based on the last three rows of the table, we can consider it a strong and appropriate emerging algorithm. Table 9 contains the simulation results of the LDWPSO, FIPSO, BLPSO, RCBBOG, GL-25, and GBABC algorithms in (X. Chen et al., 2017) and of CSBO (this study) for D = 50. According to this table, the Mr value for dimension 30 is 1.8, while for dimension 50 it is 2.0333, demonstrating that increasing the dimension somewhat reduces the performance of CSBO.
Furthermore, although the performance of CSBO decreases slightly as the dimension increases, it still holds the first ranking, with the best result for half of the test functions and no worst results.

EPSDE
Each member of the initial population is assigned a mutation strategy and parameter values randomly selected from the respective pools. Mutation strategies and parameter values that produce better offspring survive, while those that fail to produce better offspring are reinitialized.

SaDE
The F values are randomly generated from a normal distribution with a mean of 0.5 and a standard deviation of 0.3. The mutation strategy and the parameter CR are self-adapted based on their previous performance.
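As a small illustration of the sampling rule described above, the sketch below draws SaDE-style scale factors from N(0.5, 0.3); the truncation of unusable values to the interval (0, 2] is an assumption of this sketch, following common DE practice, and is not specified in the text.

```python
import random

def sample_scale_factors(n, mean=0.5, std=0.3, seed=0):
    """Draw n DE scale factors F ~ N(mean, std), redrawing values
    that fall outside (0, 2] (a common SaDE-style convention)."""
    rng = random.Random(seed)
    factors = []
    while len(factors) < n:
        f = rng.gauss(mean, std)
        if 0.0 < f <= 2.0:  # keep only usable scale factors
            factors.append(f)
    return factors
```

Drawing F anew for every mutation keeps the search stochastic while concentrating most scale factors near the effective value 0.5.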

A comparison of CSBO with the state-of-the-art PSO algorithms (PSOs)
In this section, to better demonstrate the performance of the proposed CSBO, we compare our results with a novel PSO variant called promotional particle swarm optimization (PPSO) and other state-of-the-art PSOs. The results for dimension 30 are given in Table 10 based on the mean value and standard deviation. The plus sign (+) indicates that CSBO has beaten the competitor algorithm on that function, while the minus sign (−) and equal sign (=) indicate failure and equal performance of CSBO, respectively, compared to the corresponding algorithm. The overall statistical results are given in Figures 4 and 5. As the results of the first three unimodal functions and the simple multimodal function F4 show, CSBO is clearly superior to the PSOs. On the other hand, the performance of the proposed algorithm on the last three test functions, F28, F29, and F30, which are composition test functions, is average and relatively poor compared to the PSOs; CSBO scored its worst ranking of 7 on these three functions.
Looking at Figure 4, where Sr is the sum of the total ranks of an algorithm, CSBO, with an average rank of 2.6333, is the decisive winner of this comparative study. Its closest competitor is PPSO, with an average rank of 3.80, a significant difference. The figure also shows that FIPS-URing is the weakest algorithm. Figure 5 lists the algorithms top-down based on the number of functions for which they achieve the best value or rank 1 (Nbest) and the number of times they have the weakest rank (Nworst). CSBO ranks 1 for 16 functions, while its nearest competitor, PPSO, ranks 1 for only four functions. In addition, GPSO and CLPSO obtained neither the best nor the worst value for any function. The proposed CSBO, SLPSO, and PPSO algorithms never obtained the worst value, a great advantage for these algorithms. Moreover, although FIPS-URing obtained the best value for two functions, it obtained the worst value 14 times, which shows its deficiency. We use Wilcoxon's test to determine whether two algorithms behave significantly differently (Ghosh et al., 2012). The p-values from applying Wilcoxon's test to CSBO and the PSOs are shown in Table 11; p-values less than 0.05 (the significance level) are in boldface. From these data, it is clear that CSBO outperforms the other eleven PSO algorithms. Furthermore, although CSBO is not significantly superior to PPSO, it outperforms it on an average-ranking basis.
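The Wilcoxon comparison described above can be sketched as follows. This is a minimal two-sided signed-rank test with the normal approximation, implemented from scratch so it is self-contained; the paper's own analysis may use an exact variant, and the input vectors here are hypothetical per-run errors of two algorithms, not data from the paper.

```python
import math

def wilcoxon_signed_rank(a, b):
    """Two-sided Wilcoxon signed-rank test (normal approximation).
    a, b: paired samples, e.g. per-function mean errors of two algorithms.
    Returns (W_plus, p). Zero differences are discarded; tied absolute
    differences share their average rank."""
    diffs = [x - y for x, y in zip(a, b) if x != y]
    n = len(diffs)
    if n == 0:
        return 0.0, 1.0
    ranked = sorted(diffs, key=abs)
    # assign average ranks to tied absolute differences
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(ranked[j + 1]) == abs(ranked[i]):
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(ranked, ranks) if d > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, p
```

A p-value below 0.05 would be set in boldface in a table such as Table 11, indicating a statistically significant difference between the two algorithms.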

A comparison of CSBO with popular inspired optimization algorithms (IOAs)
In this section, to assess CSBO against inspired algorithms and their improved versions, we compare it with the results of several algorithms published in recent articles under conditions similar to those of CSBO, including MFO (Mirjalili, 2015) and LJA (Jaya with Lévy flight) (Iacca et al., 2021). In addition, the parameter settings of some of these algorithms are given in Table 13. As in the previous section, CSBO has impressive performance on the first four test functions compared to the other algorithms, the IOAs. Its worst rank, a middling rank of 7, occurs for test function 12 (the shifted and rotated Katsuura function), with a value of 5.01E−01; the best and second-best values for this function, 3.00E−02 and 1.51E−01, are obtained by SOO+BOBYQA and NRGA, respectively. On the other hand, the worst solutions, obtained by m-SCA, NIWTLBO, LJA, and mTLBO, are 1.76E+00, 1.93E+00, 2.49E+00, and 2.50E+00. Given these results, the solutions obtained by CSBO are acceptable, and with some modifications it could become an even more robust algorithm.
Note for Table 12: The '−', '+', and '=' denote that the performance of CSBO is worse than, better than, and similar to that of the corresponding algorithm, respectively.
A summary of the statistical results for CSBO and the IOAs is given in Figures 6 and 7. A quick look at Figure 6 reveals that CSBO is deservedly the best and most reliable algorithm. The closest algorithm to CSBO is OptBees, with an average rank of 3.8667, a difference of 1.8 from the average rank of CSBO, which is considerable. WOA, LJA, and MFO (three new and trendy algorithms) have the worst rankings, with averages of 10.9667, 11.6667, and 12.1000, which are difficult to accept. On the other hand, a careful look at Figure 7 shows that CSBO obtained the best solutions for 14 test functions here and never obtained the worst ranking; as mentioned, its worst rank was 7, which is within acceptable limits. The modern algorithms SOO+BOBYQA and DIRECT-L obtained the best solutions for 13 and 8 functions, respectively, in this study. At the same time, SOO+BOBYQA once and DIRECT-L five times obtained the worst solutions, which underlines the valuable functionality of our proposed CSBO.
It is clear from the figure that the two algorithms LJA and MFO, which never obtained the best solution, are in the red: their solutions were the worst for most functions compared to the other algorithms. The p-values from applying Wilcoxon's test to CSBO and the IOAs were also computed; p-values less than 0.05 (the significance level) are in boldface. From these data, it is clear that CSBO outperforms the other algorithms. Although CSBO is not statistically superior to SOO+BOBYQA, it outperforms it on an average-ranking basis. The most comparable strategy is SOO+BOBYQA, which achieved the best outcomes in 13 cases but lost in 17 cases against CSBO.

CEC 2017 benchmark functions
In order to verify the performance of the proposed CSBO algorithm against other algorithms, we use the CEC2017 benchmark functions in this section. These real-world-modeled functions (unimodal, simple multimodal, hybrid, and composition benchmark tests) have been used successfully in many recent articles and are therefore adopted here to test the proposed CSBO algorithm.
In this section, we use dimension 30 with 30 runs for each test function. The number of function evaluations is 300,000. Also, Npop and NR are set to 45 and 15, respectively. The parameter settings of some competitors are given in Table 15.
Although CSBO obtained relatively poor rankings (rank 3) for the two functions 10 and 22, the last three rows of Table 16 reveal that CSBO is a robust and appropriate emerging algorithm.

CSBO complexity
The CSBO method was implemented in MATLAB 7.6, and the simulations were performed on a Pentium IV E5200 PC with 2 GB of RAM. The CSBO algorithm was evaluated on all test functions of the CEC 2014 competition (J. J. Liang et al., 2013). The algorithm was executed 30 times for each test problem, with a total of 10,000 × D function evaluations. The computational complexity of the CSBO algorithm is determined according to the procedure provided in (J. J. Liang et al., 2013): T0 denotes the execution time of the baseline scheme in Table 17; T1 is the time required to compute F18 for 200,000 evaluations; and T2 denotes the time required to run the proposed technique for 200,000 evaluations. T2 is measured five times, and the mean of the five measurements is denoted T̂2. Finally, the complexity of the algorithm is reported as T1, T̂2, and (T̂2 − T1)/T0.
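The timing protocol above can be sketched as follows. The baseline loop is the standard CEC-style test program (a fixed mix of elementary floating-point operations) and the `iters` parameter is an implementation detail of this sketch; the actual T1 and T2 come from timing the benchmark function and the full algorithm, which are not reproduced here.

```python
import math
import time

def t0_benchmark(iters=1_000_000):
    """Baseline program whose runtime defines T0 in the CEC protocol:
    a fixed sequence of elementary floating-point operations."""
    start = time.perf_counter()
    for i in range(iters):
        x = 0.55 + i
        x = x + x; x = x / 2.0; x = x * x
        x = math.sqrt(x); x = math.log(x); x = math.exp(x)
        x = x / (x + 2.0)
    return time.perf_counter() - start

def algorithm_complexity(t0, t1, t2_runs):
    """Complexity measure (T2_hat - T1) / T0, where T2_hat is the
    mean of the (five) measured T2 values."""
    t2_hat = sum(t2_runs) / len(t2_runs)
    return (t2_hat - t1) / t0
```

Dividing by T0 normalizes away machine speed, so complexity figures remain comparable across different hardware.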
In addition, the computational cost of CSBO is mainly determined by three processes: blood-particle initialization, fitness assessment, and blood-particle update. The computational complexity of the initialization procedure is O(Npop), where Npop is the number of blood particles. The CSBO update has a computational cost of O(Itermax × Npop) + O(Itermax × Npop × D), comprising the search for the optimal position and the update of the location vectors of all blood particles, where Itermax is the maximum number of iterations. The overall complexity is therefore dominated by the O(Itermax × Npop × D) term.

Application of CSBO algorithm for engineering optimization problems
In the second phase of this study, several experiments were carried out to compare the proposed CSBO algorithm with the best results reported for various manufacturing parameter optimization problems, such as engineering design optimization, parameter estimation for frequency-modulated (FM) sound waves, and maximization of reliability in engineering systems.
Over the last few years, population-based swarm intelligence based on various EAs has attracted much interest among researchers seeking optimal solutions to various types of manufacturing parameter and engineering design optimization problems, in order to improve system features such as performance and cost. A variety of manufacturing topics can be defined as optimization problems with many nonlinear characteristics and inequality (or equality), nonlinear (or linear) constraints. Product and process design, tuning of manufacturing parameters, scheduling, and production planning are some examples from this area. Therefore, to attain the desired product quality with high efficiency, it is essential to use optimization methods in manufacturing development. Since manufacturing processes are becoming more complicated and product quality must satisfy high standards, the investigation of improved methods for solving these problems remains an ongoing subject in the current competitive market (G. Zhang et al., 2013).

The constrained engineering design optimization using CSBO
In order to verify the results of the proposed CSBO algorithm on constrained engineering design applications, three problems from the comparative literature are chosen: the optimal design of a tension/compression spring, the three-bar truss, and the pressure vessel, with the goal of minimizing the total manufacturing and design cost. The population size is 45, and the maximum number of iterations is 400 for the three-bar truss and 5000 for the pressure vessel and tension/compression spring problems, with 30 independent runs of the CSBO algorithm. It should be noted that many researchers have examined these engineering problems in different studies in recent years; thus, we only investigated the cases that consider all the limitations and conditions.

Problem 1: the tension/compression spring design problem
The tension/compression problem aims to minimize the weight of a tension/compression spring subject to constraints on the minimum deflection, shear stress, surge frequency, and diameter, as shown in Figure 8 (Arora, 2004). The problem includes one linear and three nonlinear inequality constraints and three continuous design variables: the wire diameter x1 (d) with 0.05 ≤ x1 ≤ 2, the mean coil diameter x2 (D) with 0.25 ≤ x2 ≤ 1.3, and the number of active coils x3 (P) with 2 ≤ x3 ≤ 15 (Akay & Karaboga, 2012; Coello & Montes, 2002).
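As a sketch of how a meta-heuristic such as CSBO would evaluate candidate designs for this problem, the code below uses the standard spring-design formulation from the cited literature (objective and the four g(x) ≤ 0 constraints); the static-penalty constraint handling and the penalty coefficient `rho` are assumptions of this sketch, not details taken from the paper.

```python
def spring_weight(x1, x2, x3):
    """Spring weight: (x3 + 2) * x2 * x1**2 (standard formulation)."""
    return (x3 + 2) * x2 * x1 ** 2

def spring_constraints(x1, x2, x3):
    """The four standard constraints in g_i(x) <= 0 form:
    deflection, shear stress, surge frequency, and diameter limit."""
    g1 = 1 - (x2 ** 3 * x3) / (71785 * x1 ** 4)
    g2 = ((4 * x2 ** 2 - x1 * x2)
          / (12566 * (x2 * x1 ** 3 - x1 ** 4))
          + 1 / (5108 * x1 ** 2) - 1)
    g3 = 1 - 140.45 * x1 / (x2 ** 2 * x3)
    g4 = (x1 + x2) / 1.5 - 1
    return [g1, g2, g3, g4]

def penalized_fitness(x1, x2, x3, rho=1e6):
    """Static-penalty fitness: weight plus rho times the summed
    constraint violations (zero for feasible designs)."""
    violation = sum(max(0.0, g) for g in spring_constraints(x1, x2, x3))
    return spring_weight(x1, x2, x3) + rho * violation
```

For a feasible design the penalty term vanishes and the fitness equals the weight, so the optimizer is steered toward the feasible region before fine-tuning the objective.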
In fact, CSBO obtained a standard deviation of 4.38E−14, which is an outstanding result. Although other algorithms could also reach the global optimum, they all have a worse Std. than CSBO.
Problem 2: the three-bar truss design problem
For this problem, the CSBO results were compared with those of several algorithms, including biogeography-based optimization (BBO) (Simon, 2008; Kaveh & Eslamlou, 2020). The results indicate that the CSBO algorithm is quite competitive and effective for the optimal design of the three-bar truss structure. It is worth mentioning that this is a simple problem with only two dimensions and can be solved easily by most algorithms.

Problem 3: the pressure vessel optimization problem
In the pressure vessel optimal design problem, shown in Figure 10, the objective is to minimize the total cost F3(X), comprising the material, forming, and welding costs of a cylindrical vessel structure. There are two discrete design variables, the thickness of the shell (x1 or Ts) with 1 ≤ x1 ≤ 99 and the thickness of the head (x2 or Th) with 1 ≤ x2 ≤ 99, where x1 and x2 are integer multiples of 0.0625. In addition, there are two continuous design variables: the inner radius (x3 or R) with 10 ≤ x3 ≤ 200 and the length of the cylindrical section of the vessel, not including the head (x4 or L), with 10 ≤ x4 ≤ 200 (Q. He & Wang, 2007b; Brajevic & Tuba, 2013). The optimal design problem can then be expressed as in (Q. He & Wang, 2007b).
Figure 10. The pressure vessel optimal design problem.
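The formulation referenced above is a literature standard; since the equations are not reproduced in the text here, the sketch below restates the commonly used cost function and constraints from (Q. He & Wang, 2007b) as code, with a helper that enforces the 0.0625-multiple rule for the discrete thickness variables.

```python
import math

def vessel_cost(x1, x2, x3, x4):
    """Total cost: material + forming + welding (standard model)."""
    return (0.6224 * x1 * x3 * x4
            + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4
            + 19.84 * x1 ** 2 * x3)

def vessel_constraints(x1, x2, x3, x4):
    """Standard constraints in g_i(x) <= 0 form."""
    g1 = -x1 + 0.0193 * x3                      # shell thickness vs. radius
    g2 = -x2 + 0.00954 * x3                     # head thickness vs. radius
    g3 = (-math.pi * x3 ** 2 * x4               # minimum enclosed volume
          - (4.0 / 3.0) * math.pi * x3 ** 3
          + 1_296_000)
    g4 = x4 - 240                               # length limit
    return [g1, g2, g3, g4]

def snap_thickness(t):
    """Round a thickness to the nearest integer multiple of 0.0625 in,
    as required for the discrete variables x1 and x2."""
    return round(t / 0.0625) * 0.0625
```

In a population-based optimizer, `snap_thickness` would be applied to x1 and x2 after each update so that candidate solutions stay on the discrete grid while x3 and x4 remain continuous.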
where Rs is the reliability of the system, and F(·) and g(·) are the objective and constraint functions of the RRAO problem for the overall parallel-series system, respectively; the problem is to maximize F(r, n) = Rs subject to g(r, n) ≤ l. The g(·) functions are usually associated with system cost, volume, and weight. r = (r1, r2, . . . , rm) and n = (n1, n2, . . . , nm) are the component reliability and redundancy allocation number vectors of the m subsystems, respectively, and l is the system resource limitation.

A real-world example: maximizing the reliability of the over-speed protection system of a gas turbine
Over-speed detection plays an important role in mechanical and electrical systems. When an over-speed event occurs, it is vital to halt the fuel supply using several control valves (V1 to V4). The over-speed protection system of a gas turbine, formulated as a mixed-integer nonlinear RRAP optimization problem, is depicted in Figure 11. The input parameters of the over-speed protection system are summarized in Table 23 (T.-C. Chen, 2006). In this reliability maximization problem, the redundancy levels are bounded by 1 ≤ nd ≤ 10, nd ∈ Z+ (positive integers in the discrete space) (Eq. (33)). The system constraints include: (1) the combined weight, volume, and redundancy allocation constraint, where vd is the volume of all components of the d-th subsystem and V is the upper volume limit of the subsystem products.
(2) The system cost constraint, where C is the upper cost limit of the system and C(rd) is the cost of all components with reliability rd at the d-th stage. T is the operating time during which the components must work.
(3) The system weight constraint. Table 24 lists the best results obtained by the CSBO algorithm with Itermax = 3000 and a population size of Npop = 45, compared with many previously reported works. It is obvious from this table that the CSBO algorithm outperforms the other reported algorithms.
Figure 11. The block diagram of the over-speed protection system of a gas turbine (T.-C. Chen, 2006; Ghavidel et al., 2018).
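As a sketch of the objective evaluated above (the exact valve data come from Table 23, which is not reproduced here), the reliability of a series system of parallel-redundant subsystems and a generic resource-limit check can be written as:

```python
def system_reliability(r, n):
    """Reliability of a series system of parallel-redundant subsystems:
    Rs = prod over d of (1 - (1 - r_d)**n_d), where r_d is the component
    reliability and n_d the redundancy level of subsystem d."""
    rs = 1.0
    for rd, nd in zip(r, n):
        rs *= 1.0 - (1.0 - rd) ** nd
    return rs

def within_limits(constraint_values, limits):
    """Check resource constraints g_k(r, n) <= l_k elementwise
    (e.g. the volume, cost, and weight limits of the RRAP problem)."""
    return all(g <= l for g, l in zip(constraint_values, limits))
```

A candidate (r, n) pair is kept by the optimizer only if `within_limits` holds for all three resource constraints; among feasible candidates, the one with the largest `system_reliability` wins.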

Discussions and prospect of the future
The proposed CSBO algorithm was compared with other well-known nature-inspired algorithms on the CEC 2005, CEC 2014, and CEC 2017 standard benchmarks and on five popular real-world engineering problems.
The statistical analysis of the benchmark functions demonstrates that this method can produce promising and competitive outcomes. It was also found that CSBO performs well in both exploration and exploitation on real-parameter (shifted) multimodal and expanded multimodal functions, as well as on real-parameter unimodal functions. Moreover, the results on the real-parameter composition and hybrid functions demonstrate that CSBO strikes an appropriate balance between exploration and exploitation. CSBO's average optimized results and the standard deviations of those results are comparable to those produced by other optimization methods, and the convergence speed comparisons further demonstrate the algorithm's rapid convergence capability. It would be fascinating to apply CSBO to other optimization problems in many science and engineering sectors in the future, and numerous research directions can be offered for future work. First, the influence of various spirals could be investigated. Second, a binary implementation of CSBO may be an intriguing future project. Third, it is recommended to develop operators for solving multiobjective problems using CSBO. Another intriguing issue would be further research on the NR parameter value so that it can be determined automatically without user control.

Conclusion
In this article, we presented a new meta-heuristic optimization algorithm inspired by the functionality of the circulatory system in the human body, named the Circulatory System Based Optimization (CSBO) algorithm. The mathematical modeling of CSBO and its functionality as an optimizer were presented. CSBO was tested on a wide variety of complex real-world functions and compared with many well-known optimization algorithms. Various test functions, including unimodal, multimodal, hybrid, and composition standard benchmarks of CEC 2005, CEC 2014, and CEC 2017, were used to test the performance of the proposed algorithm. The results showed the superior performance of CSBO compared to state-of-the-art algorithms in terms of exploration, exploitation, local optima avoidance, and convergence behavior. A dimensional scalability analysis was also conducted for CSBO on the 30- and 50-dimensional CEC2014 functions, and the results indicated that CSBO can efficiently search the feasible space to find optimal or near-optimal solutions. Finally, CSBO was applied to several practical engineering problems: the tension/compression spring design, the three-bar truss design, the pressure vessel design, the parameter estimation for FM sound waves, and the reliability-redundancy allocation optimization. The simulation results indicated that CSBO is quite competitive and robust for optimal design problems compared with many modern and advanced optimization algorithms in the recent literature. Therefore, CSBO can be considered a modern and robust algorithm for future studies and optimization applications.

Disclosure statement
No potential conflict of interest was reported by the author(s).

Funding
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2022R1A4A3032838) and in part by the Chung-Ang University Research Grants in 2021.