Flexible Job-Shop Scheduling Method Based on Deep Reinforcement Learning
Affected by dynamic disturbances in the workshop, a single scheduling rule cannot consistently obtain good results for the shop scheduling problem. To this end, a scheduling method based on dueling double DQN (D3QN) is proposed in this paper to solve the flexible job-shop scheduling problem. First, by transforming the scheduling problem into a Markov decision process, a mathematical model of the reinforcement learning task is constructed, and 18 state features of the production system, 9 scoring actions for evaluating machines and jobs, and reward functions related to the scheduling objectives are designed. Then, based on the dueling double DQN algorithm, a machine agent and a job agent are continuously trained through their interaction with the workshop production system to select the machine and the job with the highest score at each scheduling decision point, thereby completing the task of allocating jobs to machines. Finally, simulation experiments compare the proposed method with scheduling methods that directly select the machine-tool number or select among dispatching rules. The results show that the proposed method obtains better scheduling results.
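As an illustration of the two mechanisms combined in D3QN, the sketch below shows, in plain Python, the dueling aggregation Q(s,a) = V(s) + A(s,a) - mean(A) and the double-DQN bootstrap target, in which the online network selects the next action and the target network evaluates it. This is a minimal, hypothetical sketch, not the paper's implementation; the function names and the example numbers are assumptions for illustration only.

```python
# Minimal sketch (not the paper's implementation) of the two ideas
# combined in dueling double DQN (D3QN), using plain Python lists.

def dueling_q_values(value, advantages):
    """Dueling head: Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)."""
    mean_adv = sum(advantages) / len(advantages)
    return [value + a - mean_adv for a in advantages]

def double_dqn_target(reward, gamma, q_online_next, q_target_next, done):
    """Double DQN: the online net picks the next action,
    the target net evaluates it, reducing overestimation bias."""
    if done:
        return reward
    best_action = max(range(len(q_online_next)), key=lambda a: q_online_next[a])
    return reward + gamma * q_target_next[best_action]

# Hypothetical numbers for a 3-action toy case:
q = dueling_q_values(value=1.0, advantages=[0.5, -0.5, 0.0])
# q == [1.5, 0.5, 1.0]

y = double_dqn_target(reward=1.0, gamma=0.9,
                      q_online_next=[0.2, 0.8, 0.1],
                      q_target_next=[0.3, 0.6, 0.4],
                      done=False)
# y ≈ 1.0 + 0.9 * 0.6 = 1.54
```

In the paper's setting, each agent would output Q-values over its 9 scoring actions and be trained on targets of this form; subtracting the mean advantage keeps the V/A decomposition identifiable.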
Keywords: Deep reinforcement learning; Flexible job-shop scheduling; Neural networks; Deep Q-network; Reward function