Abstract
Convolutional Neural Networks (CNNs) have achieved remarkable results in many application fields. However, CNNs have a large number of network parameters and therefore consume substantial computation and storage resources, which prevents them from being effectively deployed on platforms with limited storage and computation capacity. To address this issue, this paper proposes a new compact convolution module called DSC-Ghost-Conv, which combines the advantages of depthwise separable convolution (DSC) and the Ghost convolution module (Ghost-Conv). DSC-Ghost-Conv replaces the standard convolution used in the Ghost convolution module with depthwise separable convolution, so as to reduce the resource costs of the Ghost module. DSC-Ghost-Conv can be used as a plug-and-play component to implement the ordinary convolutional layers of typical CNNs such as VGG-16, ResNet-50 and GoogleNet. Experimental results on the MNIST and CIFAR-10 datasets show that implementing the ordinary convolutional layers of CNNs with DSC-Ghost-Conv not only achieves performance competitive with the original CNNs, but also greatly reduces their number of network parameters and floating-point operations (FLOPs). This demonstrates that the proposed DSC-Ghost-Conv can effectively reduce the resource costs of CNNs.
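The parameter savings the abstract claims can be illustrated with back-of-the-envelope counts. The formulas below for standard convolution, DSC, and the Ghost module are the standard ones (bias terms ignored); the `dsc_ghost` count is an illustrative assumption in which the Ghost module's primary convolution is replaced by DSC, not necessarily the paper's exact formulation.

```python
def standard_conv(c_in, c_out, k):
    # k x k standard convolution: one k x k filter per (input, output) channel pair
    return c_in * c_out * k * k

def dsc(c_in, c_out, k):
    # depthwise separable convolution: depthwise k x k conv + pointwise 1x1 conv
    return c_in * k * k + c_in * c_out

def ghost(c_in, c_out, k, s=2, d=3):
    # Ghost module: a primary standard conv produces c_out/s channels, then
    # (s-1) cheap depthwise d x d operations generate the remaining feature maps
    m = c_out // s
    return c_in * m * k * k + m * (s - 1) * d * d

def dsc_ghost(c_in, c_out, k, s=2, d=3):
    # assumed DSC-Ghost-Conv variant: the primary conv of the Ghost module
    # is replaced by a depthwise separable convolution
    m = c_out // s
    return dsc(c_in, m, k) + m * (s - 1) * d * d

if __name__ == "__main__":
    # example layer: 256 -> 256 channels, 3x3 kernel
    for name, fn in [("standard", standard_conv), ("DSC", dsc),
                     ("Ghost", ghost), ("DSC-Ghost", dsc_ghost)]:
        print(f"{name:10s} {fn(256, 256, 3):>9,d} params")
```

For a 256-to-256-channel 3x3 layer with these assumed settings (s=2, d=3), the counts fall from 589,824 (standard) to 296,064 (Ghost) to 36,224 (DSC-Ghost), consistent with the direction of the reduction the abstract reports.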