Machine Learning and Deep Learning: Simulation with Python – kpga.py


I have rewritten the genetic algorithm code (kpga.py) from Chapter 3 of “Machine Learning and Deep Learning: Simulation with Python” to make it easier to understand.

Environment

  • Python 3.6.5 (Anaconda)

Code

I have changed variable and constant names substantially. I have also modified the init_parcel function so that it generates items automatically instead of reading them from a test file, and changed the eval_fit function to output the total weight along with the fitness.
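
For reference, here is a minimal sketch of what the modified init_parcel could look like. The constant names N_ITEMS and MAX_VALUE and the (weight, value) tuple representation are my own assumptions for illustration, not necessarily what the actual code uses:

    import random

    N_ITEMS = 30      # number of items (assumed constant name)
    MAX_VALUE = 100   # upper bound on random weights and values (assumed)

    def init_parcel():
        """Generate random (weight, value) pairs instead of reading a test file."""
        return [(random.randint(1, MAX_VALUE), random.randint(1, MAX_VALUE))
                for _ in range(N_ITEMS)]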

The book describes eval_fit as the fitness-calculation function, but the fitness here is simply the total value of the selected items. In the code below, eval_fit returns both the total weight and the total value (the fitness).
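
Under that convention, eval_fit could look roughly like the sketch below. WEIGHT_LIMIT, the 0/1 chromosome encoding, and the rule that overweight solutions get fitness 0 are assumptions for illustration:

    WEIGHT_LIMIT = 200  # knapsack capacity (assumed constant name)

    def eval_fit(parcel, gene):
        """Return (total_value, total_weight) for a 0/1 chromosome."""
        total_weight = sum(w for (w, v), bit in zip(parcel, gene) if bit)
        total_value = sum(v for (w, v), bit in zip(parcel, gene) if bit)
        if total_weight > WEIGHT_LIMIT:
            total_value = 0  # one common way to penalize overweight solutions
        return total_value, total_weight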

There is still room for further improvement, such as encapsulating the items in a class, but this is a basic implementation.
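
For example, a dataclass would be one way to encapsulate the items (purely illustrative, reusing the assumed N_ITEMS and MAX_VALUE constants from the sketch above):

    import random
    from dataclasses import dataclass

    @dataclass
    class Item:
        weight: int
        value: int

    # The parcel could then hold Item objects instead of bare tuples.
    parcel = [Item(random.randint(1, MAX_VALUE), random.randint(1, MAX_VALUE))
              for _ in range(N_ITEMS)]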

Keep in mind that the same combinations of items can be generated repeatedly, and there is no guarantee that the score improves from one generation to the next. The average score, however, tends to increase over the generations.
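
One way to see this is to log the best and average fitness each generation. The following rough sketch reuses the eval_fit sketch above; the elite-plus-mutation scheme is a deliberately simplified stand-in, not the book's actual operators:

    import random

    POOL_SIZE = 30      # assumed pool size
    GENERATIONS = 50    # assumed number of generations
    MUTATION_RATE = 0.01

    def evolve_and_log(parcel):
        pool = [[random.randint(0, 1) for _ in parcel] for _ in range(POOL_SIZE)]
        for gen in range(GENERATIONS):
            fits = [eval_fit(parcel, gene)[0] for gene in pool]
            print(f"gen {gen:3d}: best={max(fits):4d} avg={sum(fits) / len(fits):7.1f}")
            # Keep the better half, then refill by mutating copies of the survivors.
            ranked = [g for _, g in sorted(zip(fits, pool), key=lambda t: t[0], reverse=True)]
            survivors = ranked[:POOL_SIZE // 2]
            children = [[bit ^ (random.random() < MUTATION_RATE) for bit in g]
                        for g in survivors]
            pool = survivors + children
        return pool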

Because the algorithm proceeds stochastically rather than according to a fixed plan, under certain numeric conditions ng_pool can end up containing only a single candidate, which may cause the program to terminate with an error. If no candidates remain at all, the selection loop can never exit.
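
As an illustration of this failure mode and one possible guard, here is a hedged sketch that reuses the eval_fit sketch above; the above-average selection rule and the name select_ng are assumptions, not the book's exact method:

    import random

    def select_ng(parcel, pool):
        """Build the next-generation pool from genes with above-average fitness."""
        fits = [eval_fit(parcel, gene)[0] for gene in pool]
        avg = sum(fits) / len(fits)
        ng_pool = [g for g, f in zip(pool, fits) if f > avg]
        # Guard: crossover needs at least two parents. If selection left fewer,
        # pad the pool with random members so the main loop cannot get stuck.
        while len(ng_pool) < 2:
            ng_pool.append(random.choice(pool))
        return ng_pool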