Analysis of Accelerating Training of Distributed Models through Intra-Network Model Parameter Distribution
This paper proposes a downlink communication optimization scheme based on randomized rounding, whose algorithmic implementation achieves an approximation ratio of O(log|V|), where V is the set of programmable switches. Large-scale simulation experiments show that, compared with state-of-the-art solutions, the scheme reduces downlink communication overhead by 14.5% to 35.8%.
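The abstract does not detail the algorithm, but randomized rounding with an O(log|V|) guarantee typically follows the classic pattern: solve (or assume) a fractional LP relaxation, include each candidate independently with probability scaled by a logarithmic factor, then repair any constraint left unsatisfied. The sketch below is a hypothetical illustration of that generic technique on a toy set-cover instance; the sets, the fractional solution `x`, and the repair rule are illustrative assumptions, not the paper's actual scheme.

```python
import math
import random

def randomized_rounding_cover(sets, universe, x, rng):
    """Round a fractional cover x (one value per set) to an integral cover.

    Generic randomized rounding (illustrative, not the paper's algorithm):
    include each set independently with probability min(1, x_i * ln n),
    then repair any element left uncovered by adding some set containing it.
    """
    n = len(universe)
    scale = math.log(max(n, 2))  # the log factor behind the O(log n) ratio
    chosen = [i for i, xi in enumerate(x)
              if rng.random() < min(1.0, xi * scale)]
    covered = set().union(*(sets[i] for i in chosen)) if chosen else set()
    # Repair step: guarantees feasibility for elements missed by chance.
    for e in universe - covered:
        for i, s in enumerate(sets):
            if e in s:
                chosen.append(i)
                covered |= s
                break
    return sorted(set(chosen))

# Toy instance: three "switches", each reaching two hosts; assume the
# fractional LP optimum assigns x_i = 0.5 to every set.
universe = {1, 2, 3}
sets = [{1, 2}, {2, 3}, {1, 3}]
x = [0.5, 0.5, 0.5]
cover = randomized_rounding_cover(sets, universe, x, random.Random(42))
assert set().union(*(sets[i] for i in cover)) == universe
```

The repair step makes the output always feasible, while the probabilistic selection keeps the expected cost within a logarithmic factor of the fractional optimum, which is where a bound of the form O(log|V|) usually comes from.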
distributed model training; in-network computing; multicast transmission