To address the low training speed and poor resource utilization of distributed neural network training in heterogeneous environments, a heterogeneity-aware parameter server for distributed neural network training (H-PS) was proposed. The resources of each worker were fully utilized by dynamically scheduling tasks according to the workers' current status, so that all workers completed their tasks at the same time. A pipeline scheme was proposed to further improve worker efficiency by allowing workers to continue model training while parameters were being transmitted between the parameter server and the workers. A flexible quantization scheme was proposed to reduce the communication overhead between the parameter server and the workers by compressing the parameters of the neural network model. Experiments were conducted on an emerging container cluster. The results indicate that the proposed H-PS can reduce the overall training time by a factor of 1.4 to 3.5 compared with existing methods.
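The abstract does not give implementation details of the flexible quantization scheme. As a minimal sketch of the general idea of parameter quantization for communication compression, the following illustrates uniform quantization of float32 parameters into 8-bit integers before transmission, and dequantization on the receiving side (function names and the choice of uniform min-max quantization are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def quantize(params, num_bits=8):
    """Uniformly map float32 parameters onto num_bits-bit integers.

    Returns the quantized array plus the (min, scale) needed to
    reconstruct approximate values on the receiver side. Sending
    uint8 instead of float32 cuts communication volume by 4x.
    """
    lo, hi = float(params.min()), float(params.max())
    scale = (hi - lo) / (2 ** num_bits - 1) or 1.0  # avoid div-by-zero for constant tensors
    q = np.round((params - lo) / scale).astype(np.uint8)
    return q, lo, scale

def dequantize(q, lo, scale):
    """Reconstruct approximate float32 parameters from quantized values."""
    return q.astype(np.float32) * scale + lo

# Example: round-trip a random weight tensor.
w = np.random.randn(1000).astype(np.float32)
q, lo, scale = quantize(w)
w_hat = dequantize(q, lo, scale)
# The per-element quantization error is bounded by the step size.
assert np.abs(w - w_hat).max() <= scale
```

A "flexible" scheme in the paper's sense would presumably vary `num_bits` (and hence the accuracy/bandwidth trade-off) per transmission; here the bit width is simply a parameter.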