"Genetic Algorithms (GA), are meta heuristic algorithms inspired by the process of natural selection and belong to a larger class of evolutionary algorithms (EA)."
-- (From Wikipedia, the free encyclopedia)
This repository implements a genetic algorithm (GA) in Python 3, using only NumPy and Joblib as additional libraries. The basic approach offers a "StandardGA" class, where the whole population of chromosomes is replaced by a new one at the end of each iteration (or epoch). More recently, the new "IslandModelGA" class was added, which offers a new genetic operator (MigrationOperator) that allows for periodic migration of the best individuals among the different island populations.
NOTE: For computationally expensive fitness functions, the StandardGA class provides the option of parallel evaluation of the individual chromosomes, by setting parallel=True in the run(...) method. However, for fast fitness functions this will actually cause the algorithm to run more slowly (due to the overhead of opening and closing the parallel pool), so the default setting is "parallel=False". The IslandModelGA, on the other hand, runs in parallel mode by definition.
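For instance, assuming a StandardGA object "ga" has already been set up (and omitting any other run() arguments for brevity), enabling the parallel evaluation looks like this:

```python
# Enable parallel evaluation of the fitness function; worthwhile only when
# the fitness function itself is computationally expensive. Any other
# run() arguments are omitted here for brevity.
ga.run(parallel=True)
```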
NEWS: Several new genetic operators have been added, such as PositionBasedCrossover (POS), PartiallyMappedCrossover (PMX) and OrderCrossover (OX1). These operators address combinatorial problems, where the genome can become invalid through the application of the other standard operators. Additionally, the Boltzmann selector has been implemented, where the individuals that will form the new population are selected using a temperature-controlled Boltzmann distribution.
The current implementation offers a variety of genetic operators, including:

- Selection operators
- Crossover operators
- Mutation operators
- Migration operators
- Meta operators

(NOTE: Meta operators randomly call the other operators (crossover/mutation/migration) from a predefined set, with equal probability.)
In the examples that follow, I show how one can use this code to run a GA for optimization problems (maximization/minimization), with and without constraints. The project is ongoing, so new things might come along the way. Incorporating additional genetic operators is also easy: simply inherit from the corresponding base operator classes and implement the basic interface as described therein (a rough sketch is given below).
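As a purely illustrative sketch, a custom crossover operator might be defined along the following lines. The base-class name, import path, method name and chromosome accessors below are assumptions used only to convey the idea; consult the operator base classes in the repository for the actual interface to implement.

```python
import random

from pygenalgo.genome.chromosome import Chromosome

# ASSUMED import path of the crossover base class (for illustration only).
from pygenalgo.operators.crossover.crossover_operator import CrossoverOperator


class SingleGeneSwapCrossover(CrossoverOperator):
    """Hypothetical crossover that swaps one randomly chosen gene."""

    def crossover(self, parent1: Chromosome, parent2: Chromosome):
        # Work on copies so the parents remain unchanged (assumed clone()).
        child1, child2 = parent1.clone(), parent2.clone()

        # Pick a random locus and swap the corresponding genes
        # (assumed 'genome' list accessor).
        k = random.randrange(len(child1.genome))
        child1.genome[k], child2.genome[k] = child2.genome[k], child1.genome[k]

        return child1, child2
    # _end_def_
# _end_class_
```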
There are two options to install the software.
The easiest way is to visit the GitHub page of the project and simply download the source code in zip format. This option does not require git to be installed on the computer.
Alternatively, one can clone the project directly using git as follows:

```
git clone https://github.com/vrettasm/PyGeneticAlgorithms.git
```
The recommended version is Python 3.10 (and above). To simplify the installation of the required packages, just use:

```
pip install -r requirements.txt
```
The most important thing the user has to do is to define the "fitness function". A template is provided below, in addition to the examples that follow.
```python
from pygenalgo.genome.chromosome import Chromosome

# Fitness function <template>.
def fitness_func(individual: Chromosome, f_min: bool = False):
    """
    This is how a fitness function should look. The whole evaluation
    should be implemented in (or wrapped around) this function.

    :param individual: Individual chromosome to be evaluated.

    :param f_min: Bool flag indicating whether we are dealing with
    a minimization or a maximization problem.
    """
    # CODE TO IMPLEMENT.

    # Assign the estimated value.
    f_val = ...

    # If we want minimization we return the negative.
    return -f_val if f_min else f_val
# _end_def_
```
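For instance, a minimal sketch of a fitness function for the Sphere problem (minimizing f(x) = sum_i x_i^2) could look as follows. Note that the way the gene values are extracted from the chromosome is an assumption here; consult the Chromosome (and Gene) classes for the exact accessors.

```python
from pygenalgo.genome.chromosome import Chromosome

# Sphere function (minimization): f(x) = sum_i x_i^2.
def sphere_func(individual: Chromosome, f_min: bool = True):
    # ASSUMPTION: the gene values are accessed as 'gene.value' through
    # 'individual.genome'; check the Chromosome/Gene classes for the
    # exact accessors used by the toolkit.
    f_val = sum(gene.value ** 2 for gene in individual.genome)

    # Since this is a minimization problem we return the negative value.
    return -f_val if f_min else f_val
# _end_def_
```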
Once the fitness function is defined correctly, the next steps are straightforward, as described in the examples (and roughly sketched below).
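As a rough illustration, a typical workflow might look like the sketch below. The import path, the constructor arguments and the run() arguments are assumptions used only to convey the idea; the linked examples show the exact signatures.

```python
# Minimal workflow sketch. NOTE: the import path, the constructor argument
# names and the run() arguments are assumptions for illustration only; see
# the examples for the exact signatures used by the toolkit.
from pygenalgo.engines.standard_ga import StandardGA    # assumed import path

# 1) Create an initial population of Chromosome objects (problem dependent).
population = make_initial_population()                  # hypothetical helper

# 2) Set up the GA with the fitness function and the genetic operators.
toy_ga = StandardGA(initial_pop=population,             # assumed argument names
                    fit_func=sphere_func,
                    select_op=selection_operator,        # a selection operator instance
                    cross_op=crossover_operator,         # a crossover operator instance
                    mutate_op=mutation_operator)         # a mutation operator instance

# 3) Run the evolution; 'parallel=True' would enable parallel evaluation.
toy_ga.run(epochs=500, parallel=False)                   # assumed 'epochs' argument name
```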
Below are some optimization examples that show how to use these algorithms:
Problem | Variables | Objectives | Constraints | Execution |
---|---|---|---|---|
Sphere | M (=5) | 1 | no | serial |
Rastrigin | M (=5) | 1 | no | serial |
Rosenbrock | M (=2) | 1 | 1 | serial |
Binh & Korn | M (=2) | 2 | 2 | serial |
Sphere | M (=10) | 1 | no | parallel |
Easom | M (=2) | 1 | no | parallel |
Traveling Salesman Problem | M (=10) | 1 | yes | serial |
N-Queens puzzle | M (=8) | 1 | yes | parallel |
OneMax | M (=50) | 1 | no | serial |
Tanaka | M (=2) | 2 | 2 | serial |
Zakharov | M (=8) | 1 | no | serial |
Osyczka | 6 | 2 | 6 | parallel |
Constrained optimization problems can be easily addressed using the Penalty Method. Moreover, multi-objective optimization problems (with or without constraints) can also be solved using the weighted sum method, as shown in the examples above.
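For instance, a minimal sketch of a penalty-method fitness, for a minimization problem with a single constraint g(x) <= 0, could look as follows. The gene accessors and the penalty coefficient are assumptions used only for illustration.

```python
from pygenalgo.genome.chromosome import Chromosome

# Sketch of a penalty-method fitness for a minimization problem with a
# single constraint g(x) <= 0. The gene accessors ('individual.genome',
# 'gene.value') and the penalty coefficient are illustrative assumptions.
def constrained_fitness(individual: Chromosome, f_min: bool = True):
    x = [gene.value for gene in individual.genome]

    # Objective function, e.g. the Sphere function.
    f_val = sum(xi ** 2 for xi in x)

    # Example constraint g(x) = 1 - sum(x) <= 0 (i.e. sum(x) >= 1).
    g_val = 1.0 - sum(x)

    # Penalty method: penalize the amount of constraint violation.
    f_val += 1.0e3 * max(0.0, g_val) ** 2

    # Negate for minimization, since the GA maximizes the fitness.
    return -f_val if f_min else f_val
# _end_def_
```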
This work is described in:
- Michail D. Vrettas and Stefano Silvestri (2024). "PyGenAlgo: a simple and powerful toolkit for genetic algorithms". (Submitted for publication to the journal SoftwareX; under review.)
For any questions/comments (regarding this code) please contact me at: [email protected]