Block-grained domain adaptation for neural networks at the edge
Running deep neural networks on edge devices faces two challenges: model scaling and domain adaptation. Existing model scaling techniques and unsupervised online domain adaptation techniques suffer from coarse scaling granularity, a limited scaling space, and long online domain adaptation times. To address these two challenges, this paper proposes a block-grained model scaling and domain adaptation training method called EdgeScaler, which consists of an offline phase and an online phase. For the model scaling challenge, in the offline phase, blocks are detected and extracted from various DNNs and then converted into multiple derived blocks. In the online phase, combinations of blocks and the connections between them provide a large-scale scaling space that solves the model scaling problem. For the domain adaptation challenge, a block-specific residual adapter is designed and inserted into the blocks in the offline phase. In the online phase, when a new target domain arrives, all adapters are trained, solving the domain adaptation problem for every option in the block-grained scaling space. Test results on a real edge device, the Jetson TX2, show that EdgeScaler reduces domain adaptation training time by an average of 85.14% and training energy consumption by an average of 84.1%, while providing a large-scale set of scaling options.
deep neural network; edge device; elastic scaling; block; domain adaptation
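The block-specific residual adapter described in the abstract can be illustrated with a minimal NumPy sketch. All names, shapes, and the bottleneck structure below are assumptions for illustration; the abstract does not specify the adapter's internal architecture. The sketch shows the key idea that an adapter adds a small trainable residual path to a frozen block's output, so only the adapter parameters need updating when a new target domain arrives:

```python
import numpy as np

rng = np.random.default_rng(0)

class ResidualAdapter:
    """Hypothetical bottleneck adapter: y = x + x @ W_down @ W_up.

    W_up is zero-initialized, so the adapter starts as an identity
    mapping and does not perturb the pretrained block before
    online domain adaptation training begins.
    """
    def __init__(self, dim: int, bottleneck: int):
        self.W_down = rng.normal(0.0, 0.01, (dim, bottleneck))
        self.W_up = np.zeros((bottleneck, dim))

    def __call__(self, x: np.ndarray) -> np.ndarray:
        return x + x @ self.W_down @ self.W_up

def block_forward(x, block_weight, adapter):
    # Frozen block computation (a ReLU layer stands in for a real
    # DNN block), followed by the block-specific adapter.
    h = np.maximum(x @ block_weight, 0.0)
    return adapter(h)

dim, bottleneck = 8, 2
adapter = ResidualAdapter(dim, bottleneck)
block_w = rng.normal(size=(dim, dim))   # frozen, shared across domains
x = rng.normal(size=(4, dim))           # a batch of 4 feature vectors
y = block_forward(x, block_w, adapter)
print(y.shape)  # (4, 8)
```

Because the frozen block weights are shared by every scaling option, training only the lightweight adapters is what would let one online adaptation pass cover the whole block-grained scaling space, consistent with the training-time and energy reductions the abstract reports.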