Dependency-Based Task Offloading Scheme in VEC Using Deep Reinforcement Learning
With the increasing demand for low-latency vehicular applications such as autonomous driving and augmented reality (AR), real-time task offloading has become a new challenge for Internet of Vehicles (IoV) users. Many offloading schemes overlook the dependencies between computational tasks or within the tasks themselves, particularly those arising from AR applications. This oversight leads to excessive computation delays and higher offloading failure rates when tasks are offloaded to edge servers. Additionally, because factors such as vehicle mobility, task information, and available server resources are dynamic, offloading schemes need to be adjusted in real time according to the environmental state. To address these issues, we propose a dependency-based task offloading scheme using Deep Reinforcement Learning (DRL). In the Vehicular Edge Computing (VEC) framework, this scheme models dependent tasks as directed acyclic graphs (DAGs) and uses matrices to represent the dependencies among subtasks. The optimization problem, which aims to minimize computation delay, is formulated as a Markov Decision Process (MDP). The DRL algorithm Deep Q-Network (DQN) is employed to solve the MDP and determine the offloading decisions. Simulation experiments evaluate the performance of this scheme against existing schemes in terms of computation delay and offloading failure rate. The results demonstrate that the proposed scheme completes tasks within their deadlines while effectively reducing computation delay and the offloading failure rate.
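As a minimal illustration of the dependency-matrix representation described above, the sketch below encodes a small subtask DAG as a binary adjacency matrix and checks which subtasks become eligible for offloading once their predecessors finish. The four-subtask DAG, the matrix D, and the helper ready_subtasks are hypothetical examples for exposition only, not the paper's actual task model.

```python
import numpy as np

# Hypothetical example: 4 subtasks of one AR application.
# Entry D[i][j] = 1 means subtask j depends on the output of subtask i
# (i.e., there is a directed edge i -> j in the DAG).
D = np.array([
    [0, 1, 1, 0],   # subtask 0 feeds subtasks 1 and 2
    [0, 0, 0, 1],   # subtask 1 feeds subtask 3
    [0, 0, 0, 1],   # subtask 2 feeds subtask 3
    [0, 0, 0, 0],   # subtask 3 has no successors
])

def ready_subtasks(dep_matrix, finished):
    """Return indices of unfinished subtasks whose predecessors have all finished."""
    n = dep_matrix.shape[0]
    ready = []
    for j in range(n):
        if j in finished:
            continue
        preds = np.where(dep_matrix[:, j] == 1)[0]
        if all(p in finished for p in preds):
            ready.append(j)
    return ready

print(ready_subtasks(D, finished=set()))        # [0]
print(ready_subtasks(D, finished={0}))          # [1, 2]
print(ready_subtasks(D, finished={0, 1, 2}))    # [3]
```

In an offloading scheme of this kind, only the subtasks returned by such a readiness check would be candidates for local execution or offloading at the current decision step; the choice among candidate servers is what the DQN agent learns.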