Deep Deterministic Policy Gradient-based Energy Efficiency Optimization Algorithm for MEC Networks
Although Mobile Edge Computing (MEC) technology can provide users with data processing services, the computing resources of MEC servers are limited. Therefore, reasonably offloading tasks from users to MEC servers and reasonably allocating MEC server resources to users according to task requirements are key to improving user energy efficiency. To address this problem, a Deep Deterministic Policy Gradient-based Energy Efficiency Optimization (DDPG-EEO) algorithm is proposed. Under the premise of meeting the time delay requirements, an optimization problem is formulated that maximizes energy efficiency over the task offloading rate and the resource allocation strategy. The optimization problem is then modeled as a Markov Decision Process (MDP) and solved with the Deep Deterministic Policy Gradient (DDPG) method. The simulation results show that the DDPG-EEO algorithm reduces the energy consumption of user terminals (UTs) and improves the task completion rate.
mobile edge computing; task offloading; resource allocation; reinforcement learning; deep deterministic policy gradient
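As a rough illustration of the DDPG-based solution the abstract refers to, the sketch below shows one actor-critic update for an offloading/allocation MDP. The state, action, and reward definitions (state = task size, channel gain, server load; action = offloading rate and allocated resource share, both in [0, 1]; reward = negative energy consumption with a delay penalty) are illustrative assumptions, not taken from the paper, and the network sizes and hyperparameters are placeholders.

```python
# Minimal DDPG sketch for an MEC offloading/allocation MDP (assumed shapes,
# not the paper's exact formulation).
import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 3, 2  # (task size, channel gain, server load); (offload rate, resource share)

class Actor(nn.Module):
    """Deterministic policy: maps a state to (offloading rate, resource share) in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, ACTION_DIM), nn.Sigmoid(),
        )
    def forward(self, s):
        return self.net(s)

class Critic(nn.Module):
    """Q(s, a): estimated long-term energy-efficiency return."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + ACTION_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )
    def forward(self, s, a):
        return self.net(torch.cat([s, a], dim=-1))

actor, critic = Actor(), Critic()
actor_tgt, critic_tgt = Actor(), Critic()
actor_tgt.load_state_dict(actor.state_dict())
critic_tgt.load_state_dict(critic.state_dict())
actor_opt = torch.optim.Adam(actor.parameters(), lr=1e-4)
critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
GAMMA, TAU = 0.99, 0.005  # discount factor and soft-update rate (placeholder values)

def update(batch):
    """One DDPG update from a replay-buffer batch of tensors (s, a, r, s2, done)."""
    s, a, r, s2, done = batch
    # Critic: regress Q(s, a) toward the bootstrapped target computed with target networks.
    with torch.no_grad():
        target_q = r + GAMMA * (1 - done) * critic_tgt(s2, actor_tgt(s2))
    critic_loss = nn.functional.mse_loss(critic(s, a), target_q)
    critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()

    # Actor: deterministic policy gradient, i.e. maximize Q(s, actor(s)).
    actor_loss = -critic(s, actor(s)).mean()
    actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()

    # Soft-update target networks toward the online networks.
    for tgt, src in ((actor_tgt, actor), (critic_tgt, critic)):
        for p_t, p in zip(tgt.parameters(), src.parameters()):
            p_t.data.mul_(1 - TAU).add_(TAU * p.data)
```

During training, exploration noise would typically be added to the actor's output before clipping it back into [0, 1], and the reward would encode the energy-versus-delay trade-off the abstract optimizes.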