Particle swarm optimization (PSO) is a population-based stochastic optimization technique which was developed at Purdue, Indiana, in 1995 [264,265,266]. This optimization technique combines heuristic and genetic approaches to describe and map social behavior such as bird flocking and fish schooling. These kinds of behavior are exploited for optimization purposes to search for an optimal state. The PSO algorithm uses evolutionary computation techniques similar to those of, for instance, a GA. For a PSO run, the system of interest is initialized with a set of initial states (initial guess). In contrast to a pure genetic algorithm, PSO has no genetic operators such as crossover and mutation to build its child parameter sets. Whereas the genetic algorithm provides the information of all individuals to the genetic operators, PSO uses this type of information only for a selected subset of the best individuals. Nevertheless, the PSO algorithm describes the behavior of an individual according to the best individuals, which are determined by a certain fitness function. In contrast to evolutionary algorithms, which model life and populations, PSO characterizes the social behavior of a population and the collective behavior of each individual of the population [267,268,269]. For instance, a flock or swarm of birds that searches randomly for food in a certain area organizes itself to find a strategy that leads each individual to food as fast as possible. The obvious best strategy is to follow the individual that is nearest to the food. Thus, with each iteration, the solution of each individual improves because each individual is nearer to the food than before [270].
The PSO algorithm can learn from these types of scenarios and use them to find optimal solutions for other tasks. In this context, an individual is called a 'bird' and represents a single solution of the optimization problem. Each individual consists of a certain number of characteristic parameters (genomes and properties). These individuals can be regarded as members of a population of the optimization algorithm and are hence called particles of a swarm. More abstract synonyms are parameter set, design, or solution.
The PSO method learns from the different applied scenarios and uses them to solve the current optimization problem. In PSO, each solution (``bird'') is a single point in the parameter space. Each bird has a fitness (score) value which evaluates the quality of the current solution, i.e., how far the bird is away from the food. Characteristic parameters of such a bird are its position and velocity, which direct the trajectory of the bird (particle) towards the food (optimum). Thus, the particles fly through the parameter space by following the particles which currently have the best solutions.
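As a minimal illustration (the class and field names are hypothetical and not part of any particular PSO library), such a particle can be represented by nothing more than its position, its velocity, and the fitness of its current position:

\begin{verbatim}
from dataclasses import dataclass
import numpy as np

@dataclass
class Particle:
    position: np.ndarray      # current point in the parameter space (the "bird")
    velocity: np.ndarray      # directs the particle's trajectory towards the optimum
    fitness: float = np.inf   # score of the current position (distance to the "food")
\end{verbatim}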
The PSO algorithm is initialized with a set of randomly chosen solutions. Then,
the PSO method searches for optimal parameters by updating the
current particle generation according to evolutionary and social rules.
In every iteration, each particle of the population
is updated using two different ``best'' values: gbest and pbest.
The first ``best'' value gbest is the globally best solution which has
been achieved so far during the entire optimization.
The second one, pbest, is the best solution within a certain environment
of the current particle. The environment can be specified by a distance or by a
topological description. Hence, pbest represents the current
optimum in the local environment of the particle, whereas gbest
is the current global optimum of the optimization task.
After determining the two characteristic ``best'' values of the population,
the current position and velocity of each particle are updated according to
\begin{align}
  v_i &\;\leftarrow\; v_i + c_1 r_1 \left(\mathrm{pbest}_i - x_i\right) + c_2 r_2 \left(\mathrm{gbest} - x_i\right), \tag{4.25}\\
  x_i &\;\leftarrow\; x_i + v_i, \tag{4.26}
\end{align}
where $x_i$ and $v_i$ denote the position and velocity of particle $i$, $c_1$ and $c_2$ are acceleration coefficients, and $r_1$, $r_2$ are uniformly distributed random numbers in $[0,1]$.
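A minimal sketch of this update loop is given below. It assumes the form of Eqs. (4.25) and (4.26) stated above, a ring neighbourhood as the particle environment used for pbest, and a fitness function that is to be minimized; the function name \texttt{pso}, its parameters, and the velocity clamping \texttt{vmax} are illustrative choices rather than part of any particular PSO implementation.

\begin{verbatim}
import numpy as np

def pso(fitness, dim, n_particles=30, n_iter=200,
        c1=2.0, c2=2.0, bounds=(-5.0, 5.0), k=2, vmax=1.0, seed=0):
    """Minimal PSO sketch following Eqs. (4.25)-(4.26); `fitness` is minimized."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds

    # Initialization with a set of randomly chosen solutions (initial guess).
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = rng.uniform(-vmax, vmax, size=(n_particles, dim))

    gbest, gbest_f = x[0].copy(), np.inf
    for _ in range(n_iter):
        f = np.array([fitness(p) for p in x])

        # gbest: globally best solution achieved so far during the entire run.
        if f.min() < gbest_f:
            gbest_f, gbest = float(f.min()), x[np.argmin(f)].copy()

        x_old = x.copy()
        for i in range(n_particles):
            # pbest: best solution within the particle's environment, here a
            # ring neighbourhood of k particles on each side (one possible
            # "topological description" of that environment).
            nbr = [(i + j) % n_particles for j in range(-k, k + 1)]
            pbest = x_old[nbr[int(np.argmin(f[nbr]))]]

            r1, r2 = rng.random(dim), rng.random(dim)
            # Eq. (4.25): velocity update driven by pbest and gbest.
            v[i] = v[i] + c1 * r1 * (pbest - x_old[i]) + c2 * r2 * (gbest - x_old[i])
            v[i] = np.clip(v[i], -vmax, vmax)
            # Eq. (4.26): position update.
            x[i] = np.clip(x_old[i] + v[i], lo, hi)

    return gbest, gbest_f

# Usage on a simple sphere test function (hypothetical example):
best_x, best_f = pso(lambda p: float(np.sum(p**2)), dim=3)
\end{verbatim}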