A Q-Learning Proposal for Tuning Genetic Algorithms in Flexible Job Shop Scheduling Problems
Keywords: Genetic Algorithm, Flexible Job Shop Scheduling Problems, Q-Learning, tuning
Genetic algorithms (GAs) belong to the category of evolutionary algorithms and are frequently used to solve challenging combinatorial problems. However, they typically require customization to a particular problem type, and their performance is heavily influenced by numerous hyperparameters and reproduction operators. In this work, we propose a Reinforcement Learning approach for fine-tuning genetic algorithms in Flexible Job Shop Scheduling Problems (FJSP), in which the main parameters of the genetic operators are trained to select the most promising values. The approach returns an optimized schedule that respects the scenario-specific constraints, such as the relationship among release date, due date, and processing time, which machine must be selected from a set of alternative machines, and which sequence-dependent setup times can be filtered.
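The core idea of tuning GA operators with Q-Learning can be sketched as a tabular agent that chooses a (crossover rate, mutation rate) pair for the next generation and is rewarded by the resulting improvement. This is a minimal illustrative sketch, not the authors' exact formulation: the action set, state encoding, and reward definition are all assumptions.

```python
import random

# Hypothetical discrete action set: candidate (crossover, mutation) rate pairs.
ACTIONS = [(cx, mut) for cx in (0.6, 0.8, 0.9) for mut in (0.05, 0.1, 0.2)]

class QTuner:
    """Tabular Q-learning over GA hyperparameter choices (illustrative only)."""

    def __init__(self, alpha=0.1, gamma=0.9, epsilon=0.2):
        self.q = {}  # maps (state, action_index) -> estimated value
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def select(self, state):
        # Epsilon-greedy choice of an action index for the current state.
        if random.random() < self.epsilon:
            return random.randrange(len(ACTIONS))
        return max(range(len(ACTIONS)),
                   key=lambda a: self.q.get((state, a), 0.0))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning update; the reward could be, e.g., the relative
        # makespan improvement achieved by the last NSGA-II generation.
        best_next = max(self.q.get((next_state, a), 0.0)
                        for a in range(len(ACTIONS)))
        old = self.q.get((state, action), 0.0)
        self.q[(state, action)] = old + self.alpha * (
            reward + self.gamma * best_next - old)
```

A scheduler would call `select` before each generation to pick operator rates, run the generation, and feed the observed improvement back through `update`.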
The approach takes as input FJSP instances with varying numbers of jobs and machines and uses the NSGA-II algorithm to generate solutions. These solutions are stored in a Solutions module and analyzed using Principal Component Analysis (PCA) to identify clusters of similar instances and solutions. The Q-Learning module then generates hyperparameters for each iteration of the NSGA-II algorithm based on information from the previous modules. A toy example is presented to illustrate the behavior of the proposal and the results obtained, showing that further instances of the problem can be optimized more efficiently.
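The PCA-based clustering step can be illustrated with a minimal sketch: project each instance's feature vector onto the first principal component and split instances by the sign of the score. Everything here is an assumption for illustration, in particular the two features (e.g. numbers of jobs and machines) and the two-cluster split; the paper's PCA module may use richer features and a different clustering rule.

```python
import math

def first_component_2d(data):
    """Leading eigenvector of the 2x2 covariance matrix of 2-D data."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    cxx = sum((x - mx) ** 2 for x, _ in data) / n
    cyy = sum((y - my) ** 2 for _, y in data) / n
    cxy = sum((x - mx) * (y - my) for x, y in data) / n
    # Largest eigenvalue of [[cxx, cxy], [cxy, cyy]] in closed form.
    lam = 0.5 * (cxx + cyy + math.sqrt((cxx - cyy) ** 2 + 4 * cxy ** 2))
    if abs(cxy) > 1e-12:
        vx, vy = cxy, lam - cxx          # eigenvector for eigenvalue lam
    else:
        vx, vy = (1.0, 0.0) if cxx >= cyy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm), (mx, my)

def cluster(data):
    """Assign each point to cluster 0 or 1 by the sign of its PC1 score."""
    (vx, vy), (mx, my) = first_component_2d(data)
    return [0 if (x - mx) * vx + (y - my) * vy < 0 else 1 for x, y in data]
```

With instance features such as `[(jobs, machines), ...]`, similar-sized instances land in the same cluster, which the Q-Learning module could then use as part of its state.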
Copyright (c) 2023 Christian Perez, Carlos March, Miguel A. Salido
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.