Existing edge-based collaborative inference strategies for deep neural networks (DNNs) focus only on optimizing the latency of latency-sensitive tasks, without considering the inference energy cost of energy-sensitive tasks or the efficient offloading of partitioned DNNs across heterogeneous edge servers. To address this, an edge-based DNN collaborative inference strategy built on an improved deep deterministic policy gradient (DDPG) algorithm is proposed. It accounts for each task's sensitivity to both latency and energy consumption, and optimizes a comprehensive inference cost that combines the two. The strategy decouples DNN partitioning from the computation offloading problem. First, predictive models are built for the different collaborating devices to predict the optimal partition point and the comprehensive cost of collaborative DNN inference. Then, a reward function is constructed from the predicted comprehensive inference cost, and the DDPG algorithm is used to derive an offloading decision for each DNN inference task, thereby realizing collaborative inference. Experimental results show that, compared with other DNN collaborative inference strategies, the proposed strategy makes more efficient decisions in complex collaborative inference environments, reducing average inference latency by 46%, average inference energy consumption by 44%, and average comprehensive inference cost by 46%.
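As intuition for the reward-shaping step, the sketch below shows one way a reward could be derived from a predicted comprehensive cost, and how a continuous DDPG action could be bucketed into a discrete edge-server choice. It is a minimal illustration under assumed names and values (comprehensive_cost, the sensitivity weights, the baseline, the action mapping); none of these details are taken from the paper itself.

```python
import numpy as np

# Hypothetical weights expressing a task's sensitivity: a latency-sensitive
# task might use (0.9, 0.1), an energy-sensitive one (0.1, 0.9).
def comprehensive_cost(latency_s, energy_j, w_latency, w_energy):
    """Weighted sum of predicted inference latency and energy consumption."""
    return w_latency * latency_s + w_energy * energy_j

def reward(predicted_cost, baseline_cost):
    """DDPG reward: positive when an offloading decision beats a baseline
    (e.g., device-only inference), negative otherwise."""
    return baseline_cost - predicted_cost

def action_to_server(action, n_servers):
    """DDPG emits a continuous action in [-1, 1]; a common trick for a
    discrete set of edge servers is to bucket it into a server index."""
    idx = int((action + 1.0) / 2.0 * n_servers)
    return min(idx, n_servers - 1)

# Toy scenario: one latency-sensitive task, three heterogeneous edge servers.
# (latency in seconds, energy in joules; values are illustrative only.)
servers = [(0.12, 1.8), (0.20, 0.9), (0.15, 1.2)]
baseline = comprehensive_cost(0.40, 3.0, w_latency=0.9, w_energy=0.1)

# Pretend the actor network emitted 0.3; add exploration noise and clip.
action = np.clip(0.3 + np.random.normal(0.0, 0.1), -1.0, 1.0)
choice = action_to_server(action, len(servers))

cost = comprehensive_cost(*servers[choice], w_latency=0.9, w_energy=0.1)
print(f"server {choice}: cost={cost:.3f}, reward={reward(cost, baseline):.3f}")
```

In this framing, the per-task weights are what let a single reward signal cover both latency-sensitive and energy-sensitive tasks, while the predicted cost (from the per-device predictive models) stands in for measured cost during training.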