ICV Task Offloading and Resource Allocation Based on Hybrid Deep Reinforcement Learning
With the development of Intelligent Connected Vehicle (ICV) technology, ICVs with limited computing resources face sharply rising computational demand. ICVs can offload tasks to Mobile Edge Computing (MEC) servers via Roadside Units (RSUs); however, the dynamic and complex nature of vehicular networks makes task offloading and resource allocation highly challenging. This paper formulates the problem of minimizing task computing energy consumption by jointly controlling task offloading decisions, communication power, and computing resource allocation under environmental and resource constraints. To handle the coexistence of discrete and continuous control variables in this problem, a Hybrid Deep Reinforcement Learning (HDRL) algorithm is designed: it employs a Double Deep Q-Network (DDQN) to generate task offloading decisions and a Deep Deterministic Policy Gradient (DDPG) network to determine communication power and MEC resource allocation. Furthermore, an Improved Prioritized Experience Replay (IPER) mechanism is integrated to evaluate and select actions and output the optimal strategy. Simulation results show that the proposed method converges faster and more stably than comparative algorithms, minimizes the energy consumption of task computation offloading, and adapts effectively to changes in the number of ICVs and in task sizes, demonstrating strong real-time performance and environmental adaptability.
Keywords: mobile edge computing; deep reinforcement learning; task offloading; resource allocation; prioritized experience replay
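To make the hybrid action structure concrete, the following minimal Python sketch illustrates how a DDQN-style discrete head and a DDPG-style continuous actor can be composed into one hybrid action of offloading decision, transmit power, and MEC CPU share. This is an illustration under stated assumptions, not the paper's implementation: the state dimension, the number of offloading targets, the single-layer networks, and the exploration parameters are all placeholder choices.

import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 8   # hypothetical state: task size, channel gain, queue lengths, etc.
N_OFFLOAD = 3   # hypothetical discrete choices: local execution, RSU-1, RSU-2
CONT_DIM = 2    # continuous controls: transmit power, MEC CPU share

def linear_init(in_dim, out_dim):
    # Single linear layer as a stand-in for the full networks in the paper.
    return rng.normal(0.0, 0.1, (in_dim, out_dim)), np.zeros(out_dim)

# DDQN branch: Q-values over the discrete offloading decisions.
Wq, bq = linear_init(STATE_DIM, N_OFFLOAD)
# DDPG actor branch: deterministic continuous action, squashed to (0, 1).
Wa, ba = linear_init(STATE_DIM, CONT_DIM)

def select_hybrid_action(state, eps=0.1, noise_std=0.05):
    # Discrete part: epsilon-greedy over the DDQN Q-values.
    q_values = state @ Wq + bq
    if rng.random() < eps:
        offload = int(rng.integers(N_OFFLOAD))
    else:
        offload = int(np.argmax(q_values))
    # Continuous part: DDPG actor output plus Gaussian exploration noise,
    # passed through a sigmoid so power and CPU share respect resource bounds.
    raw = state @ Wa + ba + rng.normal(0.0, noise_std, CONT_DIM)
    power, cpu_share = 1.0 / (1.0 + np.exp(-raw))
    return offload, float(power), float(cpu_share)

state = rng.normal(size=STATE_DIM)
print(select_hybrid_action(state))

In a full training loop, the discrete head would be updated with the double-Q target and the continuous actor with the deterministic policy gradient, with both branches sampling transitions from the shared (prioritized) replay buffer.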